Day-7 | Live AWS Project using SHELL SCRIPTING for DevOps | AWS DevOps project| #devops #aws #2023

Abhishek.Veeramalla
11 Jan 2023 · 28:11

Summary

TL;DR: In this DevOps tutorial, Abhishek introduces a shell script project designed for cloud infrastructure management, particularly useful for tracking AWS resource usage. He explains the motivations for moving to the cloud, focusing on manageability and cost-effectiveness. The script, which can be integrated with a cron job for automated reporting, retrieves information on S3 buckets, EC2 instances, Lambda functions, and IAM users, enhancing organizational resource monitoring and cost control.

Takeaways

  • 😀 The video is part of a full DevOps course and focuses on a real-time shell script project used by DevOps engineers on cloud infrastructure.
  • 🏭 The primary reasons for moving to cloud infrastructure are manageability and cost-effectiveness, reducing maintenance overhead and allowing pay-as-you-go usage.
  • 💰 Organizations must track resource usage to ensure cost-effectiveness, as cloud providers bill for provisioned resources (such as idle EC2 instances or unattached EBS volumes) even when they go unused.
  • 🛠️ The shell script project aims to report on AWS resource usage, such as EC2 instances, S3 buckets, Lambda functions, and IAM users, to maintain cost-effectiveness.
  • 📝 The script is a simple way to generate daily reports for managers, which can also be integrated with a reporting dashboard for continuous monitoring.
  • 🔧 The script can be improved with comments and print statements to enhance readability and provide better user experience and debugging information.
  • 🔄 The script uses AWS CLI commands to retrieve information about various AWS resources and can be customized based on the organization's needs.
  • 📜 The video demonstrates how to write and execute a shell script, including giving it executable permissions and running it to see the output.
  • 🔍 The use of 'jq', a JSON parser, is highlighted to simplify and parse the output from AWS CLI commands to get specific information like instance IDs.
  • ⏰ The script can be scheduled to run at specific times using a Cron job, ensuring that the resource usage report is generated and delivered on time.
  • 📚 The video concludes with an assignment for viewers to write and integrate the shell script with a Cron job, and offers support for questions in the comment section.

Q & A

  • What is the main focus of the video script?

    -The video script focuses on teaching viewers how to create a shell script project for tracking AWS resource usage, which is a common task for DevOps engineers working with cloud infrastructure.

  • Why would an organization move to cloud infrastructure like AWS or Azure?

    -Organizations move to cloud infrastructure primarily for two reasons: to reduce management overhead by eliminating the need to maintain their own data centers and servers, and to be cost-effective by paying only for the resources they use on a pay-as-you-go basis.

  • What is a common issue that organizations face when moving to the cloud?

    -A common issue is ensuring cost-effectiveness by monitoring and managing resource usage to avoid paying for unused instances or services, which can lead to unnecessary expenses.

  • What is the role of a DevOps engineer or AWS admin in managing cloud resources?

    -A DevOps engineer or AWS admin is primarily responsible for maintaining cost-effectiveness by tracking resource usage and ensuring that resources are being used optimally and not left idle or underutilized.

  • What is the purpose of the shell script project discussed in the video?

    -The shell script project aims to generate a daily report of AWS resource usage, including EC2 instances, S3 buckets, Lambda functions, and IAM users, to help monitor and manage cloud resources effectively.

  • How can the shell script be executed automatically at a specific time daily?

    -The shell script can be integrated with a Cron job, which is a time-based job scheduler in Unix-like operating systems. A Cron job can be set up to automatically execute the script at a specified time every day.
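
As a concrete sketch of that integration (the script path and time below are assumptions for illustration, not taken from the video), a crontab entry that runs the tracker every day at 6 PM could look like:

```shell
# Open the current user's crontab for editing with: crontab -e
# Fields: minute hour day-of-month month day-of-week command
0 18 * * * /home/ubuntu/aws_resource_tracker.sh >> /home/ubuntu/resource_tracker 2>&1
```

The `>> … 2>&1` part appends both normal output and errors to the report file, matching the redirection approach described later in the video.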

  • What is the significance of using comments in the shell script?

    -Comments in the shell script are important for providing context and understanding to anyone reading the script, including those who may not have scripting knowledge. They explain what each part of the script is intended to do.

  • What is the purpose of the 'set -x' command in the shell script?

    -The 'set -x' command enables debug mode in the shell script: the shell prints each command (expanded and prefixed with '+') to stderr before executing it, which is helpful for troubleshooting and for understanding the script's operation.
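
A minimal, self-contained illustration of this behaviour — `set -x` makes the shell print each command (prefixed with `+`) to stderr before executing it:

```shell
#!/bin/bash
# Write a tiny script that toggles debug mode, then run it and capture the trace.
cat > /tmp/trace_demo.sh <<'EOF'
set -x                      # from here on, each command is printed before it runs
echo "Listing S3 buckets"
set +x                      # turn tracing back off
echo "tracing is now off"
EOF

# The trace goes to stderr; redirect stdout away to see only the trace lines,
# which include: + echo 'Listing S3 buckets'
trace=$(bash /tmp/trace_demo.sh 2>&1 >/dev/null)
echo "$trace"
```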

  • How can the output of the AWS CLI commands be simplified to show only the necessary information?

    -The output can be simplified using 'jq', a JSON parser command-line tool. By piping the output of AWS CLI commands to 'jq', specific pieces of information, such as instance IDs, can be extracted and displayed, making the output more concise and relevant.
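
To try that pattern without an AWS account, the same `jq` filter can be run against a hand-written JSON snippet shaped like `describe-instances` output (the instance IDs below are fabricated):

```shell
#!/bin/bash
# Sample JSON in the shape returned by `aws ec2 describe-instances` (IDs are made up).
json='{"Reservations":[{"Instances":[{"InstanceId":"i-0abc123"},{"InstanceId":"i-0def456"}]}]}'

# Extract only the instance IDs, one per line.
echo "$json" | jq -r '.Reservations[].Instances[].InstanceId'
# prints:
# i-0abc123
# i-0def456
```

In the real script the input would come from `aws ec2 describe-instances` piped into the same filter.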

  • What is the final step suggested in the video script for handling the script's output?

    -The final step suggested is to redirect the output of the script to a file called 'resource_tracker'. This allows for easy access and review of the resource usage report by simply opening the file.
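
The redirection step can be sketched in plain bash; here the AWS CLI calls (which need configured credentials) are left commented out, and `/tmp/resource_tracker` stands in for wherever the report should live:

```shell
#!/bin/bash
REPORT=/tmp/resource_tracker        # the video simply calls this file resource_tracker

# Start a fresh report, then append each section to the same file.
echo "Resource usage report" > "$REPORT"

echo "Print list of S3 buckets" >> "$REPORT"
# aws s3 ls >> "$REPORT"            # real call, needs AWS credentials

echo "Print list of EC2 instances" >> "$REPORT"
# aws ec2 describe-instances >> "$REPORT"

cat "$REPORT"                       # review the collected report
```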

Outlines

00:00

🚀 Introduction to the DevOps Shell Script Project

In this introductory segment, Abhishek, the presenter, welcomes viewers back to his channel and introduces Day 7 of the full DevOps course. The focus is on a real-time shell script project commonly used by DevOps engineers in cloud infrastructure. He outlines the purpose of the project, its usefulness, and sets the stage for an explanation of how to create it. The video also delves into why organizations move to cloud infrastructure, highlighting manageability and cost-effectiveness as primary reasons. The importance of tracking resource usage to maintain cost-effectiveness is emphasized, and the role of a DevOps engineer or AWS admin in ensuring this is discussed.

05:01

📈 Understanding Cloud Infrastructure and Cost Tracking

The second paragraph delves deeper into the reasons for adopting cloud infrastructure, such as reduced maintenance overhead and the pay-as-you-go model. It discusses the challenges of managing resources in a cloud environment, particularly the need for oversight to prevent wastage. The script's role in generating daily reports on resource usage is introduced, with an example of how a company might monitor its AWS resources. The paragraph also touches on the importance of integrating such scripts with a Cron job for automated, timely reporting.

10:02

🔧 Setting Up the Shell Script for AWS Resource Tracking

In this segment, Abhishek demonstrates the initial setup for a shell script designed to track AWS resources. He explains the importance of using a widely recognized scripting language like Bash and the necessity of having the AWS CLI installed and configured. The video shows the process of creating a script file, adding a shebang line, and providing metadata about the script, such as the author, version, and purpose. The script's basic structure is outlined, with comments indicating the types of resources it will track.
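
A header in the spirit of the one written in the video — author, date, version, and purpose; the exact wording here is a reconstruction rather than a verbatim copy:

```shell
#!/bin/bash

############################################
# Author: Abhishek
# Date: 11th Jan
#
# Version: v1
#
# This script will report the AWS resource usage
############################################
```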

15:04

🛠 Writing the Script to List AWS Resources

The presenter begins writing the actual script, starting with commands to list AWS S3 buckets and EC2 instances. He explains how to find the appropriate AWS CLI commands if unfamiliar, referencing the AWS CLI documentation. The script includes comments for each command, explaining its purpose. The video also covers how to execute the script and the importance of giving it the right permissions. The initial run of the script is shown, demonstrating the basic output generated.
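
The first runnable cut of that script can be exercised even without AWS credentials by syntax-checking it with `bash -n`, which parses without executing (the `/tmp` path is an assumption for illustration):

```shell
#!/bin/bash
# Write the initial tracker script to a file: two commands, each with a comment.
cat > /tmp/aws_resource_tracker.sh <<'EOF'
#!/bin/bash
# Print list of S3 buckets
aws s3 ls

# Print list of EC2 instances
aws ec2 describe-instances
EOF

chmod +x /tmp/aws_resource_tracker.sh   # give it executable permission, as in the video
bash -n /tmp/aws_resource_tracker.sh && echo "syntax OK"
```

Running the file itself (`/tmp/aws_resource_tracker.sh`) then produces the raw output shown in the video, provided the AWS CLI is installed and configured.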

20:04

📝 Improving Script Output and Debugging

This paragraph focuses on enhancing the script's output for better clarity and user experience. Abhishek introduces the use of print statements to label the outputs and explains the use of 'set -x' for debugging purposes, which shows the script's commands and their execution. The script is improved by adding print statements for each resource type, making the output more readable and understandable. The video also demonstrates how enabling 'set -x' affects the script's output, providing a clear trace of the commands being executed.

25:05

🔄 Refining the Script with 'jq' for JSON Parsing

To refine the script further, Abhishek introduces 'jq', a JSON parsing tool, to simplify the output from the AWS CLI commands. He explains how 'jq' can extract specific information, such as instance IDs, from the JSON output returned by the AWS 'describe-instances' command. The video demonstrates how to integrate 'jq' into the script to get a cleaner, more focused output. The improved script is then executed to show the enhanced output, which is easier to read and understand.

📑 Finalizing the Script and Integrating with Cron

The final segment wraps up the script by showing how to redirect the output to a file for reporting purposes. Abhishek explains how to append all script outputs to a single file named 'resource_tracker', which can be reviewed for resource usage reports. The video concludes with an assignment for viewers to write and integrate the script with a Cron job, setting up the stage for future videos that will cover more advanced shell scripting projects.

Keywords

💡DevOps

DevOps is a set of practices that combines software development (Dev) and IT operations (Ops) to shorten the systems development life cycle and provide continuous delivery of value to end users. In the video, the speaker is teaching a DevOps course, which includes a real-time shell script project that DevOps engineers commonly use in cloud infrastructure management.

💡Cloud Infrastructure

Cloud infrastructure refers to the hardware and software resources provided over the internet or a private network. The video discusses the reasons organizations move to cloud infrastructure, such as AWS or Azure, primarily for manageability and cost-effectiveness. It is the environment where the shell script project is applied.

💡Manageability

Manageability in the context of the video refers to the ease of maintaining and operating IT systems. The speaker explains that moving to cloud infrastructure reduces the management overhead, such as the need to manage physical servers and data centers, which is a key benefit for startups and organizations.

💡Cost-effectiveness

Cost-effectiveness is the ability to minimize costs while achieving the same or better output. The video emphasizes that cloud infrastructure allows for a pay-as-you-go model, which can be more cost-effective than maintaining physical infrastructure. The shell script project is aimed at tracking resource usage to ensure cost-effectiveness.

💡AWS CLI

AWS CLI stands for Amazon Web Services Command Line Interface, a tool that lets users manage AWS services with commands typed at the command line. The script in the video uses AWS CLI commands to retrieve information about various AWS resources, such as EC2 instances and S3 buckets.

💡Shell Script

A shell script is a script written for the shell, or command-line interface, for task automation. In the video, the speaker is guiding the audience to create a shell script that tracks and reports AWS resource usage, which is a common task for DevOps engineers.

💡Cron Job

A cron job is a scheduled task that runs automatically at specified intervals on Unix-like operating systems. The video mentions integrating the shell script with a cron job to ensure that the resource usage report is generated and delivered at a specific time every day without manual intervention.

💡Resource Tracking

Resource tracking involves monitoring and managing the use of resources in a system. The main theme of the video is creating a shell script for tracking AWS resource usage, such as EC2 instances, S3 buckets, Lambda functions, and IAM users, to maintain cost-effectiveness and manageability.

💡JQ

JQ is a lightweight and flexible command-line JSON processor. In the script discussed in the video, JQ is used to parse the JSON output from AWS CLI commands to extract specific information, such as instance IDs, making the script's output more concise and user-friendly.

💡IAM Users

IAM stands for Identity and Access Management, the AWS service for managing users and their access to AWS resources. The video includes IAM users as one of the resources the shell script tracks, ensuring that the organization knows who has access to its AWS services.

Highlights

Introduction to a real-time shell script project used by DevOps engineers on cloud infrastructure.

Explanation of why organizations move to cloud infrastructure, focusing on manageability and cost-effectiveness.

The importance of reducing maintenance overhead and the 'pay as you go' model of cloud services.

Challenge of tracking resource usage to ensure cost-effectiveness in a cloud environment.

The role of DevOps engineers in maintaining cost-effectiveness by tracking resource usage.

Introduction of a shell script as a method to track and report AWS resource usage.

Use of AWS CLI for retrieving information on various AWS services such as S3, EC2, Lambda, and IAM.

Demonstration of writing a shell script to automate the reporting of AWS resource usage.

Explanation of how to use comments in a script for clarity and maintainability.

The use of 'set -x' for debugging shell scripts by printing each command before it is executed.

Integration of the shell script with a Cron job for automated and timely reporting.

The concept of a Cron job and its application in scheduling script execution at specific times.

Improving script output clarity by using 'jq', a JSON parsing tool, to filter information.

Technique of redirecting script output to a file for easy access and reporting.

Assignment for viewers to write and integrate the shell script with a Cron job.

Encouragement for viewers to ask questions and engage with the content in the comments section.

Transcripts

play00:01

hello everyone my name is Abhishek and

play00:04

welcome back to my channel so today is

play00:06

day 7 of our full devops course and in

play00:10

this class we'll be talking about a

play00:11

real-time shell script project that

play00:14

usually most of the devops engineers use

play00:16

uh when they are on the cloud

play00:18

infrastructure

play00:20

so firstly let me explain what this

play00:22

project is and why is it useful and then

play00:25

we can take a look at how to create this

play00:28

project okay so for the very uh first

play00:31

thing to say

play00:33

basically why would somebody move to

play00:35

Cloud infrastructure okay let's say

play00:37

whether it's AWS whether it's Azure so

play00:40

what is one of the primary reasons for

play00:42

any organization to move to Cloud so

play00:45

there are two things right so the first

play00:46

thing is the manageability so let's say

play00:50

uh you have a you are a startup and you

play00:54

want to maintain your own servers okay

play00:56

so the

play00:57

main problem here is that there is a lot

play01:00

of Maintenance overhead that is you have

play01:02

to create your own data center you have

play01:04

to manage your own servers you have to

play01:06

keep patching them whenever there is a

play01:08

security issue uh like you know you have

play01:10

to constantly update upgrade those

play01:12

servers so the problem here is that

play01:15

you should have a systems engineering

play01:17

team or a dedicated team who completely

play01:19

takes care of these servers and the

play01:21

systems so this is problem number one

play01:23

and the second problem is cost

play01:27

right so uh all of these providers uh

play01:32

work on a very simple basis that is pay

play01:34

as you go right pay as you use so the

play01:37

concept here is that when you move to

play01:39

Cloud if you are not using certain uh

play01:42

instances so you'll not be built for

play01:44

that whereas if you are buying your

play01:45

physical infrastructure for your company

play01:47

whether you use it or you don't use it

play01:49

you already have it in your data center

play01:51

so you have to pay for it so these are

play01:54

the primary reasons why someone would

play01:56

move to Cloud one is the uh reducing the

play01:59

manage managing overhead or reducing the

play02:03

maintenance overhead and the second

play02:04

thing is to be cost effective

play02:07

so to be cost effective what a

play02:09

organization has to do so let's say uh

play02:12

you are working for example.com and

play02:16

example.com has 100 Developers

play02:18

and you give access to all of these

play02:21

hundred developers to your AWS platform

play02:23

and everybody starts creating their own

play02:26

resources but to be cost effective you

play02:29

have to have your own ways to see if

play02:32

everybody is using the AWS resources up

play02:36

to the point or not let's say there is a

play02:38

developer X so this developer has

play02:40

created

play02:41

100 ec2 instances but nobody is using it

play02:44

or they have created EBS okay so this

play02:48

happens most of the time so EBS volumes

play02:50

are created but no ec2 instance is using

play02:53

the CBS volume so the volumes are left

play02:55

unused so because you are not using it

play02:57

AWS will not understand right so AWS

play02:59

will say that okay you have created an

play03:01

EBS and I have dedicated your volume so

play03:04

I'll be costing you

play03:06

so now as a devops engineer one of your

play03:08

primary responsibility uh I mean as a

play03:12

devops engineer or a AWS admin uh

play03:14

whoever is a person one of your primary

play03:18

responsibilities is to maintain the cost

play03:21

Effectiveness because it is one of the

play03:23

primary reasons why we move to why your

play03:26

organization moved to Cloud so that's

play03:28

why you have to always track the

play03:30

resource usage okay so what is this

play03:32

called tracking the resource usage

play03:38

so there are multiple ways to do this

play03:40

okay so I'm not saying this shell script

play03:42

that I'm showing you is the optimal way

play03:43

of doing it so people use Lambda

play03:45

functions and write some python script

play03:49

to do the same things or people can use

play03:52

see everybody is not convenient with

play03:54

python as it devops in any end of the

play03:56

day if you can achieve your goal that is

play03:58

more than enough so it doesn't mean that

play04:00

you have to write in Python only or you

play04:01

have to write uh using AWS SDK or cdk no

play04:04

you can write in your own ways social

play04:07

scripting is one of the ways to write it

play04:09

so in today's topic what we will do is

play04:12

we will say that there is an

play04:15

organization called example.com

play04:17

okay I'll show you on my AWS account and

play04:21

let's say this organization is uh is

play04:23

only using resources like ec2 because I

play04:27

cannot show you for all the resources it

play04:28

will take a lot of time so we will

play04:30

monitor the resource usage for these

play04:32

resources so one is easy to let's say

play04:35

they are also using S3 and let's say

play04:38

they are using Lambda functions

play04:42

right and let's take one more thing like

play04:45

uh

play04:47

I am because these are few common uh

play04:50

things that are used across different

play04:52

organizations so let's take the same

play04:54

example okay now what is your goal is

play04:57

every day okay every day it let's say 6

play05:01

pm

play05:03

or every day at a certain time you have

play05:06

to give this report to your manager

play05:12

so this is again just for your

play05:14

understanding usually you don't do this

play05:16

what people usually do is that uh using

play05:19

uh this scripts like using shell script

play05:21

or python what they would usually do is

play05:23

they would Supply this information to a

play05:25

reporting dashboard

play05:28

okay so whom would they give this

play05:31

information to a reporting dashboard but

play05:33

for our shell scripting knowledge

play05:36

purpose today let's say that we are

play05:38

doing it uh just for the purpose of your

play05:40

manager

play05:42

okay so ideally you send this

play05:44

information to a reporting dashboard but

play05:46

in our case let's say that today you are

play05:48

giving this information to your manager

play05:49

so how you are giving this information

play05:51

let's say that you are writing a shell

play05:54

script

play05:57

okay and using this shell script what

play06:00

you do is you create a file

play06:03

and this file will have

play06:06

all of this resource usage okay so how

play06:09

many easy to instances are active how

play06:11

many S3 buckets are there how many

play06:13

Lambda functions are there how how many

play06:15

IM users are there you'll try to put all

play06:17

of this information in a file

play06:19

okay and there is one important thing

play06:23

here to notice that I said every day you

play06:26

have to generate this report so one way

play06:28

of doing it is every day you can run

play06:31

your shell script but what is the

play06:33

problem here is let's say you are not

play06:36

available at that point of time or for

play06:38

some reason you are not able to log into

play06:40

that instance and you cannot share that

play06:42

report at 6 PM so the problem is you

play06:44

miss the timeline so instead what a

play06:47

common practice in every organization is

play06:50

this shell script can be integrated with

play06:52

a Quran job

play06:56

so what is this this is a Cron job so

play06:59

what is a Cron job Cron job is if you

play07:01

take a very simple example uh today we

play07:03

are doing uh devops Zero to Hero course

play07:05

right so every day I'm scheduling my

play07:07

video at 7 pm so how I am uh you know

play07:10

most of the times what I do is I just

play07:13

upload the video before but I say

play07:15

YouTube that upload I mean publish this

play07:17

video at 7 pm so what is YouTube doing

play07:20

on behalf of me uh it is publishing the

play07:23

video at 7 pm so I don't have to login

play07:25

uh exactly at 7 pm and click on the

play07:28

publish button so the same way if you

play07:31

create a Cron job what happens is one of

play07:33

your Linux process will wait for the 7

play07:35

pm and once the time is set to 7 PM so

play07:39

it automatically executes the shell

play07:41

script for you this is the concept of

play07:42

Cron job that's it very simple

play07:45

if somebody asks you uh how can you uh

play07:48

make sure that a certain script is

play07:50

running every day at uh X or Y timestamp

play07:53

you can simply say I can make use of the

play07:56

Cron job in Linux and using Chrome job I

play07:58

can execute this script every day at a

play08:01

given point of time okay so this is a

play08:03

very common practice so what are we

play08:05

doing here end of the day so we are

play08:07

going to write a shell script and we are

play08:10

going to integrate this shell script

play08:11

with a chrome job but

play08:13

how do we get all of this information so

play08:16

in the previous classes we learned about

play08:18

AWS

play08:21

CLA right

play08:23

so like I said there are multiple ways

play08:25

to do it you can do it through both or

play08:26

three you can do it using python any

play08:29

other ways but because we are familiar

play08:31

with shell script a bit and we are

play08:33

familiar with awcla a bit so let me

play08:35

combine both of these things okay so let

play08:37

me combine your knowledge on shell

play08:39

scripting your knowledge on AWS CLI and

play08:42

get the output that we require okay so

play08:45

this is what we are going to learn today

play08:46

and this project you can also put in

play08:50

your resume because this is a very

play08:51

generic thing that every organization

play08:53

does okay

play08:55

perfect so what I'll do is I'll stop

play08:58

this screen sharing and I'll show you my

play08:59

terminal and we can start writing the

play09:01

code

play09:04

perfect so let me stop sharing my screen

play09:08

here

play09:10

great and now let me uh pull up my

play09:13

terminal

play09:34

yeah so I'm just doing it they give a

play09:37

minute

play09:40

great done so right now you should be

play09:44

able to scream my

play09:46

uh screen uh

play09:49

yes

play09:50

okay so as I told you what is one of the

play09:54

prerequisites here so one of the preview

play09:56

sites is to have your AWS CLI installed

play09:59

so do I have the AWS CLI let me just uh

play10:02

see it here

play10:04

yeah I have my AWS CLI but what I'm

play10:07

going to do is because I am on my uh Mac

play10:09

uh you might see that there will be a

play10:12

slight difference in the scripting so

play10:13

I'll also connect to a ec2 instance and

play10:16

I'll show you from there so that

play10:17

everybody understands the same scripting

play10:20

using Linux okay so this is my uh Linux

play10:23

box and I'll write the script here if

play10:26

you see I am going to use bash so prefer

play10:29

bash because bash is one of the widely

play10:31

used scriptings and you previously most

play10:35

of the Linux boxes used to come with

play10:37

patch as default but now there is also

play10:40

dash dash that also comes as default so

play10:43

uh always try to learn a platform that

play10:46

is widely used so go for Bash

play10:49

perfect so uh do I have AWS installed uh

play10:53

the AWS CLI yes I have it installed and

play10:55

I've also configured my credentials to

play10:58

communicate with AWS so how do I do that

play11:00

you can use using AWS configure so if

play11:03

you haven't done that run the AWS

play11:04

configure command it will ask you the

play11:06

access key and then once you provide the

play11:09

access key it will ask you for the

play11:10

access uh C sorry secret access key and

play11:14

then it will ask you for the default

play11:16

region and finally your output format

play11:18

that can be Json

play11:20

once you do this your entire AWS

play11:23

configuration authentication is done so

play11:25

if you don't know how to get this access

play11:27

key secret key watch our previous video

play11:28

uh you can get that using your AWS

play11:31

console

play11:33

perfect so I have my batch platform I

play11:37

have AWS I have my AWS configured to

play11:39

authenticate to my AWS account so now I

play11:42

can start proceeding with it so let me

play11:44

start writing it as a script but I'll

play11:46

explain you in a step by step way so

play11:48

I'll not go in a hurry so whenever you

play11:52

find some issues just stop the video

play11:54

there and try to watch it one more time

play11:56

because I'll keep it as simple as

play11:58

possible

play11:59

so let us call this script as

play12:02

AWS uh

play12:05

resource

play12:07

tracker dot sh okay so I opened Bim and

play12:11

now I clicked on Escape I came to the

play12:15

insert mode now I can start writing so

play12:17

she bang followed by slash bin slash

play12:21

bash again always go with the ones that

play12:25

you are using sometimes people uh you

play12:27

know tend to use this one here uh

play12:30

just the sh but uh like I told you in

play12:34

the previous video as well sh is just a

play12:35

symbolic link and this symbolic link can

play12:38

sometimes be bash and sometimes can be

play12:41

Dash so let's say if it is bashed then

play12:44

it is well and good but if it is Dash

play12:45

then you know sometimes your script

play12:48

might fail because there is a slight

play12:50

syntax difference between bash and dash

play12:53

okay so uh shebang slash bin slash bash

play12:57

no what I want to do always start

play13:00

writing about the script because nobody

play13:02

is going to understand what this script

play13:04

is right so that's why uh first start

play13:06

with the line saying that this

play13:09

okay first provide the information like

play13:11

who is the author of it okay so in my

play13:13

case I am the author I'll say Abhishek

play13:15

and then provide the other details like

play13:18

when are you going when did you start

play13:20

writing this script so you can mention

play13:22

it as uh 11th of Jan

play13:28

okay why do you provide all this

play13:30

information because uh in the future

play13:32

whenever somebody has issues with the

play13:34

script or somebody wants to understand

play13:35

who is the author of The Script then

play13:38

they can easily approach you and they

play13:40

can ask you like okay I'm not able to

play13:42

understand this thing here or again uh

play13:45

you can have the version controlling or

play13:46

the version tracking using this so this

play13:48

is a very initial version so let me call

play13:50

V1 or you can also call this a draft

play13:52

version and finally provide information

play13:54

about the script so this Crypt

play14:00

will report the AWS resource

play14:07

usage

play14:09

perfect

play14:13

so this is about the script now I can

play14:16

start writing my script okay so and for

play14:18

the purpose of uh Simplicity I am not

play14:21

going to use uh shell functions because

play14:24

uh you know I can share the script uh

play14:27

in in a GitHub repository using

play14:29

functions but let's try to keep it

play14:31

without functions and as simple as

play14:32

possible because we are just at day 7

play14:35

and many people might not be familiar

play14:36

with shell functions or doing it as

play14:39

modular as possible so I'll try to keep

play14:41

it uh very simple like I mentioned in

play14:43

the uh starting of the video

play14:46

so firstly what are we going to track so

play14:49

for the purpose of that let's again put

play14:51

in some comments to say that what are

play14:54

the objects that we are going to track

play14:55

or what are the resources that we are

play14:57

going to track one is AWS S3

play15:01

again AWS

play15:03

ec2

play15:07

then AWS Lambda

play15:11

and finally AWS

play15:14

I am users

play15:16

so this is something that the script is

play15:18

going to report back

Let's start with S3. Suppose you don't know the AWS CLI commands. What did I tell you in the previous class? Don't worry if you don't know them: you can simply go to the AWS CLI Command Reference. There is wonderful documentation there, and AWS keeps it very simple and neat. Say you want to learn about a specific command. In my case I am going to start with S3, so I open the S3 section, and since I just want to list the buckets, I search for an option called "list". I find there is an ls command. (Let me increase the font so that everybody can see it clearly.) The docs say you can use "aws s3 ls" and it will list all the S3 buckets, so I'll put that in the script first, preceded by a comment:

# list s3 buckets
aws s3 ls

Before every command, definitely put a comment. Why? So that somebody reading the script without scripting knowledge can still understand it, and even somebody with scripting knowledge can tell from the comments alone what the code does: list S3 buckets.

Don't worry, I'll also show you how to improve the script step by step later: using the shell's set options, running the script in debug mode, avoiding pipe failures, and so on. So I have this part in place.

What's the next thing? List EC2 instances. Again, say you are new to the AWS CLI and don't know which command to use. Go back to the reference, one page up, because now you want EC2, not S3. Search for "ec2"; the AWS CLI reference shows you the whole set of commands available. I know personally that describe-instances returns this information, so I'll jump directly to it: it gives you the details of your EC2 resources. So in the script I'll use:

# list ec2 instances
aws ec2 describe-instances

If you don't know this command, go to the AWS CLI reference, read the docs, and you will find it.

After that, I want to list the AWS Lambda functions. I'll add the comment "list lambda functions" and the command:

# list lambda functions
aws lambda list-functions

I know this command already; if you don't, refer to the CLI reference.

Then, finally, list the IAM users. I think the command is "aws iam users", but even I am not sure, so let me go back and look at the reference. Under IAM I search for "list": there is list-groups, and there is list-users. So the command is not just "users" but list-users:

# list iam users
aws iam list-users

Perfect. Now let me run this initial version of the script and see what output we get.
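Putting the four commands together, the first cut of the script might look like this sketch. It assumes a configured AWS CLI, so here it is only written to a file rather than executed:

```shell
# Write the first version of the tracker script to a file.
cat > aws_resource_tracker.sh <<'EOF'
#!/bin/bash
# Simple AWS resource tracker: prints raw CLI output for four resource types.

# list s3 buckets
aws s3 ls

# list ec2 instances
aws ec2 describe-instances

# list lambda functions
aws lambda list-functions

# list iam users
aws iam list-users
EOF
grep -c '^aws ' aws_resource_tracker.sh   # four CLI calls in total
```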

I know this is not the final script; I'll improve it, but let's stop here and look at the output. First, give the script execute permissions with chmod. For now let's use 777. It is usually not good practice to use 777, but since we are not publishing the script anywhere yet, it will do.
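For comparison, a slightly safer permission scheme than 777 might look like this sketch (the file name is assumed from the video):

```shell
touch aws_resource_tracker.sh        # stand-in for the real script
chmod 777 aws_resource_tracker.sh    # what the video uses: everyone may read/write/execute
chmod 744 aws_resource_tracker.sh    # tighter alternative: owner rwx, group/others read-only
ls -l aws_resource_tracker.sh        # permissions column now shows -rwxr--r--
```

744 (or simply chmod u+x) is enough when only you need to run the script.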

Now I execute the script with ./aws-resource-tracker.sh and check the output. It printed everything straight to the terminal, so let me pipe it through "more" (./aws-resource-tracker.sh | more), which makes long output much easier to read page by page.

The first section visible here is the EC2 instance information; above it was the S3 bucket listing. It shows all the EC2 instances available in my account, and further down it also gives the list of IAM users. But wait: you cannot tell which output belongs to which command. The problem is that we haven't used any print statements. So let's modify the script and add some echo statements; with those, a user can tell what is being printed and gets useful debug information.

So add an echo before each command: echo "print list of s3 buckets", and then copy it for the rest: echo "print list of ec2 instances", echo "print list of lambda functions", and echo "print list of iam users". What are we doing here? Improving the user experience. I'll also show you one more effective way of doing it shortly. And then there is a very important construct.

Say you are looking at shell scripts written by your peers or seniors in the organization; they often contain something like "set -x" or "set -e". Why are these used? set -x puts your script into debug mode: whenever the script runs, it shows you which command is being executed along with its output. Let me enable set -x and show you what happens.

Let me run this here... oh, sorry, my bad: I typed set +x, which is not what I meant. It should be set -x. Sorry for that; let me fix it and run again.

See: now the script tells you each command as it runs. It shows aws iam list-users, the command we used; it shows aws lambda list-functions; it shows each echo statement, such as "print list of iam users". There are no Lambda functions in my account, so that command produced no output, while the IAM listing did. This way you can follow everything: first echo "print list of s3 buckets", then the command we used, aws s3 ls, and then its output.
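The trace behaviour of set -x can be seen with a tiny self-contained demo (no AWS needed); bash prints each command to stderr with a "+" prefix:

```shell
cat > debug_demo.sh <<'EOF'
#!/bin/bash
set -x   # debug mode: echo every command before executing it
echo "print list of s3 buckets"
EOF
bash debug_demo.sh 2> trace.txt   # the trace goes to stderr
cat trace.txt                     # the trace line starts with: + echo
```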

After "print list of s3 buckets" there is only one S3 bucket, the one I just created. Then describe-instances runs, printing the list of EC2 instances and their details, and finally we get the list of IAM users. So this way you can create a report. But there is still a problem: the report produces the output, but it is not very readable.

What I thought is: I'll report only the EC2 instance IDs. describe-instances returns a lot of information, and if I hand all of it to my manager, it won't be clear which part is the instance ID; I just need the instance IDs. For that there is a very handy Linux tool called jq. Since what AWS returns is JSON, we can parse the JSON with jq and extract just the instance ID. I'll show you how to do that now; the output will become much simpler.

For example, take the command we were using, aws ec2 describe-instances. If you look at its output, it is really big, but I want only the instance ID. So I'll use a pipe to send the output of that command to jq. What does jq do? It is a JSON parser: it pulls specific fields out of a JSON document. Similarly to jq there is yq, which is a YAML parser. As DevOps engineers we deal with JSON and YAML all the time, so get familiar with both jq and yq. With jq, you provide a filter in single quotes describing exactly what you are trying to extract.

I am trying to get the instance ID. Looking at the nesting, InstanceId sits inside Instances, which sits inside Reservations. So the filter starts with .Reservations, which narrows the output to just the reservations data. But I want only the instances, so it becomes .Reservations[].Instances. I use the square brackets because these are lists, not single objects; for a single object you could omit them. Finally I add .InstanceId:

aws ec2 describe-instances | jq '.Reservations[].Instances[].InstanceId'

Now it returns however many instance IDs exist, and nothing else. If you look at the result, it is just the instance IDs. I can replace the plain describe-instances call in my script with this and simplify the report: the script was already simple, but the output was so big that nobody reading it would have been interested. So I simplify it using the command I just created.
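Here is the same jq filter exercised against a canned (made-up) describe-instances response, so it runs without AWS credentials; a fallback covers machines where jq is not installed:

```shell
# Minimal shape of a describe-instances response, with a made-up instance ID.
JSON='{"Reservations":[{"Instances":[{"InstanceId":"i-0abc123def456"}]}]}'
if command -v jq >/dev/null 2>&1; then
  # the filter from the video: reservations -> instances -> instance ID
  echo "$JSON" | jq -r '.Reservations[].Instances[].InstanceId' > ids.txt
else
  echo "i-0abc123def456" > ids.txt   # fallback so the sketch still runs
fi
cat ids.txt
```

In the real script the input would come from aws ec2 describe-instances instead of the canned JSON.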

Now let me rerun the script and see what happens. Rerunning it gives a much better user experience. See here: the output starts with echo "print list of s3 buckets", followed by the command used and its output; then "print list of ec2 instances" with the jq pipeline and its output; then the list of Lambda functions with its command and output; and the list of IAM users with its command and output.

Perfect. Now all you have to do is hook this script into a cron job, and your reporting is done. But let me also show you one more thing: how to put this information into a file.

Doing that is very simple: redirect every command's output to a file. Improve the script so that each command appends its output to a file called resourceTracker, using the shell's redirection operator. Then all the information goes into resourceTracker, and anybody can just open that file and read the report.
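A sketch of the redirection idea, using echo as a stand-in for the aws calls so it runs anywhere (the file name resourceTracker follows the video):

```shell
REPORT=resourceTracker
: > "$REPORT"                                  # start with an empty report
echo "print list of s3 buckets" >> "$REPORT"   # the echo header...
echo "my-demo-bucket" >> "$REPORT"             # ...stand-in for: aws s3 ls >> "$REPORT"
echo "print list of iam users" >> "$REPORT"
cat "$REPORT"
```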

Now I'll stop here so that I can give you an assignment: write this same script and integrate it with crontab. If you have any questions I can explain in the next video, but it is very simple; I have already explained the script, so just wire it up with crontab. In a future video we will look at an advanced shell-script project; this was a very simple one. I hope you liked this video. If you have any questions, please post them in the comment section, and don't forget to like this video and share it with your friends. Thank you so much!
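As a postscript, the crontab integration assigned above could look like the following sketch; the schedule, script path, and log path are all placeholder choices, not values from the video:

```shell
# Run the tracker every day at 18:00 and append its report to a log file.
CRON_LINE='0 18 * * * /opt/scripts/aws_resource_tracker.sh >> /var/log/aws_resource_tracker.log 2>&1'
echo "$CRON_LINE" > mycron
# To actually install it:  crontab mycron   (left commented out here)
cat mycron
```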
