Day-7 | Live AWS Project using SHELL SCRIPTING for DevOps | AWS DevOps project| #devops #aws #2023
Summary
TL;DR: In this DevOps tutorial, Abhishek introduces a shell script project designed for cloud infrastructure management, particularly useful for tracking AWS resource usage. He explains the motivations for moving to the cloud, focusing on manageability and cost-effectiveness. The script, which can be integrated with a cron job for automated reporting, retrieves information on S3 buckets, EC2 instances, Lambda functions, and IAM users, enhancing organizational resource monitoring and cost control.
Takeaways
- 😀 The video is part of a full DevOps course and focuses on a real-time shell script project used by DevOps engineers on cloud infrastructure.
- 🏭 The primary reasons for moving to cloud infrastructure are manageability and cost-effectiveness, reducing maintenance overhead and allowing pay-as-you-go usage.
- 💰 Organizations must track resource usage to ensure cost-effectiveness, as cloud providers charge for unused instances and resources.
- 🛠️ The shell script project aims to report on AWS resource usage, such as EC2 instances, S3 buckets, Lambda functions, and IAM users, to maintain cost-effectiveness.
- 📝 The script is a simple way to generate daily reports for managers, which can also be integrated with a reporting dashboard for continuous monitoring.
- 🔧 The script can be improved with comments and print statements to enhance readability and provide better user experience and debugging information.
- 🔄 The script uses AWS CLI commands to retrieve information about various AWS resources and can be customized based on the organization's needs.
- 📜 The video demonstrates how to write and execute a shell script, including giving it executable permissions and running it to see the output.
- 🔍 The use of 'jq', a JSON parser, is highlighted to simplify and parse the output from AWS CLI commands to get specific information like instance IDs.
- ⏰ The script can be scheduled to run at specific times using a Cron job, ensuring that the resource usage report is generated and delivered on time.
- 📚 The video concludes with an assignment for viewers to write and integrate the shell script with a Cron job, and offers support for questions in the comment section.
Q & A
What is the main focus of the video script?
-The video script focuses on teaching viewers how to create a shell script project for tracking AWS resource usage, which is a common task for DevOps engineers working with cloud infrastructure.
Why would an organization move to cloud infrastructure like AWS or Azure?
-Organizations move to cloud infrastructure primarily for two reasons: to reduce management overhead by eliminating the need to maintain their own data centers and servers, and to be cost-effective by paying only for the resources they use on a pay-as-you-go basis.
What is a common issue that organizations face when moving to the cloud?
-A common issue is ensuring cost-effectiveness by monitoring and managing resource usage to avoid paying for unused instances or services, which can lead to unnecessary expenses.
What is the role of a DevOps engineer or AWS admin in managing cloud resources?
-A DevOps engineer or AWS admin is primarily responsible for maintaining cost-effectiveness by tracking resource usage and ensuring that resources are being used optimally and not left idle or underutilized.
What is the purpose of the shell script project discussed in the video?
-The shell script project aims to generate a daily report of AWS resource usage, including EC2 instances, S3 buckets, Lambda functions, and IAM users, to help monitor and manage cloud resources effectively.
How can the shell script be executed automatically at a specific time daily?
-The shell script can be integrated with a Cron job, which is a time-based job scheduler in Unix-like operating systems. A Cron job can be set up to automatically execute the script at a specified time every day.
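As a concrete illustration of this answer, a crontab entry for a daily 6 PM run might look like the following (the script path and report path are hypothetical examples, not taken from the video):

```shell
# Open the current user's crontab for editing:
#   crontab -e
# Field order: minute hour day-of-month month day-of-week command
# Run the tracker every day at 18:00 (6 PM), appending output to a report file:
0 18 * * * /home/ubuntu/aws_resource_tracker.sh >> /home/ubuntu/resource_tracker 2>&1
```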
What is the significance of using comments in the shell script?
-Comments in the shell script are important for providing context and understanding to anyone reading the script, including those who may not have scripting knowledge. They explain what each part of the script is intended to do.
What is the purpose of the 'set -x' command in the shell script?
-The 'set -x' command is used to enable debug mode in the shell script. It causes the shell to print each command (with its expanded arguments) before executing it, which can be helpful for troubleshooting and understanding the script's operation.
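A minimal standalone sketch of this behavior (not the video's actual script) — with `set -x` on, bash echoes each command, prefixed by `+`, before running it:

```shell
#!/bin/bash
set -x                 # enable command tracing (debug mode)
name="world"           # traced as: + name=world
echo "hello $name"     # traced as: + echo 'hello world' (then the output itself)
set +x                 # disable tracing again
echo "tracing is off"
```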
How can the output of the AWS CLI commands be simplified to show only the necessary information?
-The output can be simplified using 'jq', a JSON parser command-line tool. By piping the output of AWS CLI commands to 'jq', specific pieces of information, such as instance IDs, can be extracted and displayed, making the output more concise and relevant.
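A small self-contained sketch of this technique, assuming `jq` is installed; the JSON below is a made-up fragment shaped like `aws ec2 describe-instances` output, not real AWS data:

```shell
#!/bin/bash
# Made-up JSON shaped like a describe-instances response:
json='{"Reservations":[{"Instances":[{"InstanceId":"i-0abc123","InstanceType":"t2.micro"}]}]}'

# '[]' iterates array elements; '-r' prints raw strings without quotes.
printf '%s' "$json" | jq -r '.Reservations[].Instances[].InstanceId'
# → i-0abc123
```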
What is the final step suggested in the video script for handling the script's output?
-The final step suggested is to redirect the output of the script to a file called 'resource_tracker'. This allows for easy access and review of the resource usage report by simply opening the file.
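A small sketch of this redirection pattern; the file path `/tmp/resource_tracker_demo` and the placeholder lines are illustrative — in the real script the `echo` placeholder would be an AWS CLI command:

```shell
#!/bin/bash
report="/tmp/resource_tracker_demo"   # hypothetical report path

# '>' truncates the report; subsequent '>>' calls append to it.
echo "AWS resource usage report" > "$report"
echo "List of S3 buckets:" >> "$report"
echo "(output of 'aws s3 ls' would go here)" >> "$report"

cat "$report"
```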
Outlines
🚀 Introduction to the DevOps Shell Script Project
In this introductory segment, Abhishek, the presenter, welcomes viewers back to his channel and introduces Day 7 of the full DevOps course. The focus is on a real-time shell script project commonly used by DevOps engineers in cloud infrastructure. He outlines the purpose of the project, its usefulness, and sets the stage for an explanation of how to create it. The video also delves into why organizations move to cloud infrastructure, highlighting manageability and cost-effectiveness as primary reasons. The importance of tracking resource usage to maintain cost-effectiveness is emphasized, and the role of a DevOps engineer or AWS admin in ensuring this is discussed.
📈 Understanding Cloud Infrastructure and Cost Tracking
The second paragraph delves deeper into the reasons for adopting cloud infrastructure, such as reduced maintenance overhead and the pay-as-you-go model. It discusses the challenges of managing resources in a cloud environment, particularly the need for oversight to prevent wastage. The script's role in generating daily reports on resource usage is introduced, with an example of how a company might monitor its AWS resources. The paragraph also touches on the importance of integrating such scripts with a Cron job for automated, timely reporting.
🔧 Setting Up the Shell Script for AWS Resource Tracking
In this segment, Abhishek demonstrates the initial setup for a shell script designed to track AWS resources. He explains the importance of using a widely recognized scripting language like Bash and the necessity of having the AWS CLI installed and configured. The video shows the process of creating a script file, adding a shebang line, and providing metadata about the script, such as the author, version, and purpose. The script's basic structure is outlined, with comments indicating the types of resources it will track.
🛠 Writing the Script to List AWS Resources
The presenter begins writing the actual script, starting with commands to list AWS S3 buckets and EC2 instances. He explains how to find the appropriate AWS CLI commands if unfamiliar, referencing the AWS CLI documentation. The script includes comments for each command, explaining its purpose. The video also covers how to execute the script and the importance of giving it the right permissions. The initial run of the script is shown, demonstrating the basic output generated.
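The paragraph above can be sketched as a script skeleton. The four subcommands (`aws s3 ls`, `aws ec2 describe-instances`, `aws lambda list-functions`, `aws iam list-users`) are real AWS CLI commands, but actually running them requires configured credentials, so this sketch only writes the file and syntax-checks it (file path illustrative):

```shell
#!/bin/bash
# Write the tracker script, roughly as dictated in the video, to a file:
cat > /tmp/aws_resource_tracker.sh <<'EOF'
#!/bin/bash
# Author: Abhishek | Date: 11th Jan | Version: v1
# This script will report the AWS resource usage

# list s3 buckets
aws s3 ls

# list ec2 instances
aws ec2 describe-instances

# list lambda functions
aws lambda list-functions

# list iam users
aws iam list-users
EOF

# Give execute permission (the video uses 777 for convenience only):
chmod 777 /tmp/aws_resource_tracker.sh

# Syntax check; running the script for real needs AWS credentials:
bash -n /tmp/aws_resource_tracker.sh && echo "syntax OK"
```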
📝 Improving Script Output and Debugging
This paragraph focuses on enhancing the script's output for better clarity and user experience. Abhishek introduces the use of print statements to label the outputs and explains the use of 'set -x' for debugging purposes, which shows the script's commands and their execution. The script is improved by adding print statements for each resource type, making the output more readable and understandable. The video also demonstrates how enabling 'set -x' affects the script's output, providing a clear trace of the commands being executed.
🔄 Refining the Script with 'jq' for JSON Parsing
To refine the script further, Abhishek introduces 'jq', a JSON parsing tool, to simplify the output from the AWS CLI commands. He explains how 'jq' can extract specific information, such as instance IDs, from the JSON output returned by the AWS 'describe-instances' command. The video demonstrates how to integrate 'jq' into the script to get a cleaner, more focused output. The improved script is then executed to show the enhanced output, which is easier to read and understand.
📑 Finalizing the Script and Integrating with Cron
The final segment wraps up the script by showing how to redirect the output to a file for reporting purposes. Abhishek explains how to append all script outputs to a single file named 'resource_tracker', which can be reviewed for resource usage reports. The video concludes with an assignment for viewers to write and integrate the script with a Cron job, setting up the stage for future videos that will cover more advanced shell scripting projects.
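Pulling the whole outline together, a hedged sketch of the finished script (echo labels, `set -x`, the `jq` filter, and appending to `resource_tracker`) — paths and wording are illustrative, and it is only syntax-checked here because executing it needs AWS credentials:

```shell
#!/bin/bash
cat > /tmp/aws_resource_tracker_v2.sh <<'EOF'
#!/bin/bash
# This script will report the AWS resource usage
set -x   # debug mode: print each command before executing it

echo "Print the list of S3 buckets" >> resource_tracker
aws s3 ls >> resource_tracker

echo "Print the list of EC2 instances" >> resource_tracker
# jq narrows the large describe-instances JSON down to just the instance IDs:
aws ec2 describe-instances | jq '.Reservations[].Instances[].InstanceId' >> resource_tracker

echo "Print the list of Lambda functions" >> resource_tracker
aws lambda list-functions >> resource_tracker

echo "Print the list of IAM users" >> resource_tracker
aws iam list-users >> resource_tracker
EOF

chmod +x /tmp/aws_resource_tracker_v2.sh
bash -n /tmp/aws_resource_tracker_v2.sh && echo "syntax OK"
```

A crontab entry can then invoke this script daily, as the video's assignment suggests.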
Keywords
💡DevOps
💡Cloud Infrastructure
💡Manageability
💡Cost-effectiveness
💡AWS CLI
💡Shell Script
💡Cron Job
💡Resource Tracking
💡JQ
💡IAM Users
Highlights
Introduction to a real-time shell script project used by DevOps engineers on cloud infrastructure.
Explanation of why organizations move to Cloud infrastructure focusing on manageability and cost-effectiveness.
The importance of reducing maintenance overhead and the 'pay as you go' model of cloud services.
Challenge of tracking resource usage to ensure cost-effectiveness in a cloud environment.
The role of DevOps engineers in maintaining cost-effectiveness by tracking resource usage.
Introduction of a shell script as a method to track and report AWS resource usage.
Use of AWS CLI for retrieving information on various AWS services such as S3, EC2, Lambda, and IAM.
Demonstration of writing a shell script to automate the reporting of AWS resource usage.
Explanation of how to use comments in a script for clarity and maintainability.
The use of 'set -x' for debugging shell scripts by printing each command as it is executed.
Integration of the shell script with a Cron job for automated and timely reporting.
The concept of a Cron job and its application in scheduling script execution at specific times.
Improving script output clarity by using 'jq', a JSON parsing tool, to filter information.
Technique of redirecting script output to a file for easy access and reporting.
Assignment for viewers to write and integrate the shell script with a Cron job.
Encouragement for viewers to ask questions and engage with the content in the comments section.
Transcripts
hello everyone my name is Abhishek and
welcome back to my channel so today is
day 7 of our full devops course and in
this class we'll be talking about a
real-time shell script project that
usually most of the devops engineers use
uh when they are on the cloud
infrastructure
so firstly let me explain what this
project is and why is it useful and then
we can take a look at how to create this
project okay so for the very uh first
thing to say
basically why would somebody move to
Cloud infrastructure okay let's say
whether it's AWS whether it's Azure so
what is one of the primary reasons for
any organization to move to Cloud so
there are two things right so the first
thing is the manageability so let's say
uh you have a you are a startup and you
want to maintain your own servers okay
so the
main problem here is that there is a lot
of Maintenance overhead that is you have
to create your own data center you have
to manage your own servers you have to
keep patching them whenever there is a
security issue uh like you know you have
to constantly update upgrade those
servers so the problem here is that
you should have a systems engineering
team or a dedicated team who completely
takes care of these servers and the
systems so this is problem number one
and the second problem is cost
right so uh all of these providers uh
work on a very simple basis that is pay
as you go right pay as you use so the
concept here is that when you move to
Cloud if you are not using certain uh
instances so you'll not be billed for
that whereas if you are buying your
physical infrastructure for your company
whether you use it or you don't use it
you already have it in your data center
so you have to pay for it so these are
the primary reasons why someone would
move to Cloud one is the uh reducing the
manage managing overhead or reducing the
maintenance overhead and the second
thing is to be cost effective
so to be cost effective what a
organization has to do so let's say uh
you are working for example.com and
example.com has 100 Developers
and you give access to all of these
hundred developers to your AWS platform
and everybody starts creating their own
resources but to be cost effective you
have to have your own ways to see if
everybody is using the AWS resources up
to the point or not let's say there is a
developer X so this developer has
created
100 ec2 instances but nobody is using it
or they have created EBS okay so this
happens most of the time so EBS volumes
are created but no ec2 instance is using
the EBS volume so the volumes are left
unused so because you are not using it
AWS will not understand right so AWS
will say that okay you have created an
EBS and I have dedicated your volume so
I'll be costing you
so now as a devops engineer one of your
primary responsibility uh I mean as a
devops engineer or a AWS admin uh
whoever is a person one of your primary
responsibilities is to maintain the cost
Effectiveness because it is one of the
primary reasons why we move to why your
organization moved to Cloud so that's
why you have to always track the
resource usage okay so what is this
called tracking the resource usage
so there are multiple ways to do this
okay so I'm not saying this shell script
that I'm showing you is the optimal way
of doing it so people use Lambda
functions and write some python script
to do the same things or people can use
see not everybody is comfortable with
Python as a DevOps engineer and at the end of the
day if you can achieve your goal that is
more than enough so it doesn't mean that
you have to write in Python only or you
have to write uh using AWS SDK or cdk no
you can write in your own ways so shell
scripting is one of the ways to write it
so in today's topic what we will do is
we will say that there is an
organization called example.com
okay I'll show you on my AWS account and
let's say this organization is uh is
only using resources like ec2 because I
cannot show you for all the resources it
will take a lot of time so we will
monitor the resource usage for these
resources so one is easy to let's say
they are also using S3 and let's say
they are using Lambda functions
right and let's take one more thing like
uh
I am because these are few common uh
things that are used across different
organizations so let's take the same
example okay now what is your goal is
every day okay every day it let's say 6
pm
or every day at a certain time you have
to give this report to your manager
so this is again just for your
understanding usually you don't do this
what people usually do is that uh using
uh this scripts like using shell script
or python what they would usually do is
they would Supply this information to a
reporting dashboard
okay so whom would they give this
information to a reporting dashboard but
for our shell scripting knowledge
purpose today let's say that we are
doing it uh just for the purpose of your
manager
okay so ideally you send this
information to a reporting dashboard but
in our case let's say that today you are
giving this information to your manager
so how you are giving this information
let's say that you are writing a shell
script
okay and using this shell script what
you do is you create a file
and this file will have
all of this resource usage okay so how
many easy to instances are active how
many S3 buckets are there how many
Lambda functions are there how many
IAM users are there you'll try to put all
of this information in a file
okay and there is one important thing
here to notice that I said every day you
have to generate this report so one way
of doing it is every day you can run
your shell script but what is the
problem here is let's say you are not
available at that point of time or for
some reason you are not able to log into
that instance and you cannot share that
report at 6 PM so the problem is you
miss the timeline so instead what a
common practice in every organization is
this shell script can be integrated with
a Cron job
so what is this this is a Cron job so
what is a Cron job Cron job is if you
take a very simple example uh today we
are doing uh devops Zero to Hero course
right so every day I'm scheduling my
video at 7 pm so how I am uh you know
most of the times what I do is I just
upload the video before but I say
YouTube that upload I mean publish this
video at 7 pm so what is YouTube doing
on behalf of me uh it is publishing the
video at 7 pm so I don't have to login
uh exactly at 7 pm and click on the
publish button so the same way if you
create a Cron job what happens is one of
your Linux process will wait for the 7
pm and once the time is set to 7 PM so
it automatically executes the shell
script for you this is the concept of
Cron job that's it very simple
if somebody asks you uh how can you uh
make sure that a certain script is
running every day at uh X or Y timestamp
you can simply say I can make use of the
Cron job in Linux and using Cron job I
can execute this script every day at a
given point of time okay so this is a
very common practice so what are we
doing here end of the day so we are
going to write a shell script and we are
going to integrate this shell script
with a Cron job but
how do we get all of this information so
in the previous classes we learned about
AWS
CLI right
so like I said there are multiple ways
to do it you can do it through Boto3
you can do it using Python any
other ways but because we are familiar
with shell script a bit and we are
familiar with AWS CLI a bit so let me
combine both of these things okay so let
me combine your knowledge on shell
scripting your knowledge on AWS CLI and
get the output that we require okay so
this is what we are going to learn today
and this project you can also put in
your resume because this is a very
generic thing that every organization
does okay
perfect so what I'll do is I'll stop
this screen sharing and I'll show you my
terminal and we can start writing the
code
perfect so let me stop sharing my screen
here
great and now let me uh pull up my
terminal
yeah so I'm just doing it give me a
minute
great done so right now you should be
able to see my
screen
yes
okay so as I told you what is one of the
prerequisites here so one of the
prerequisites is to have your AWS CLI installed
so do I have the AWS CLI let me just uh
see it here
yeah I have my AWS CLI but what I'm
going to do is because I am on my uh Mac
uh you might see that there will be a
slight difference in the scripting so
I'll also connect to a ec2 instance and
I'll show you from there so that
everybody understands the same scripting
using Linux okay so this is my uh Linux
box and I'll write the script here if
you see I am going to use bash so prefer
bash because bash is one of the widely
used scripting languages and previously most
of the Linux boxes used to come with
bash as default but now there is also
dash that also comes as default so
uh always try to learn a platform that
is widely used so go for Bash
perfect so uh do I have AWS installed uh
the AWS CLI yes I have it installed and
I've also configured my credentials to
communicate with AWS so how do I do that
you can use using AWS configure so if
you haven't done that run the AWS
configure command it will ask you the
access key and then once you provide the
access key it will ask you for the
access uh C sorry secret access key and
then it will ask you for the default
region and finally your output format
that can be Json
once you do this your entire AWS
configuration authentication is done so
if you don't know how to get this access
key secret key watch our previous video
uh you can get that using your AWS
console
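The `aws configure` flow described here looks roughly like this (all key values below are placeholders, not real credentials):

```shell
$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: json
```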
perfect so I have my batch platform I
have AWS I have my AWS configured to
authenticate to my AWS account so now I
can start proceeding with it so let me
start writing it as a script but I'll
explain you in a step by step way so
I'll not go in a hurry so whenever you
find some issues just stop the video
there and try to watch it one more time
because I'll keep it as simple as
possible
so let us call this script as
AWS uh
resource
tracker dot sh okay so I opened Vim and
now I pressed 'i' and came to the
insert mode now I can start writing so
she bang followed by slash bin slash
bash again always go with the ones that
you are using sometimes people uh you
know tend to use this one here uh
just the sh but uh like I told you in
the previous video as well sh is just a
symbolic link and this symbolic link can
sometimes be bash and sometimes can be
Dash so let's say if it is bash then
it is well and good but if it is Dash
then you know sometimes your script
might fail because there is a slight
syntax difference between bash and dash
okay so uh shebang slash bin slash bash
now what I want to do always start
writing about the script because nobody
is going to understand what this script
is right so that's why uh first start
with the line saying that this
okay first provide the information like
who is the author of it okay so in my
case I am the author I'll say Abhishek
and then provide the other details like
when are you going when did you start
writing this script so you can mention
it as uh 11th of Jan
okay why do you provide all this
information because uh in the future
whenever somebody has issues with the
script or somebody wants to understand
who is the author of The Script then
they can easily approach you and they
can ask you like okay I'm not able to
understand this thing here or again uh
you can have the version controlling or
the version tracking using this so this
is a very initial version so let me call
V1 or you can also call this a draft
version and finally provide information
about the script so this script
will report the AWS resource
usage
perfect
so this is about the script now I can
start writing my script okay so and for
the purpose of uh Simplicity I am not
going to use uh shell functions because
uh you know I can share the script uh
in in a GitHub repository using
functions but let's try to keep it
without functions and as simple as
possible because we are just at day 7
and many people might not be familiar
with shell functions or doing it as
modular as possible so I'll try to keep
it uh very simple like I mentioned in
the uh starting of the video
so firstly what are we going to track so
for the purpose of that let's again put
in some comments to say that what are
the objects that we are going to track
or what are the resources that we are
going to track one is AWS S3
again AWS
ec2
then AWS Lambda
and finally AWS
IAM users
so this is something that the script is
going to report back
firstly if we start with uh S3 and let's
say you don't know the uh ec2 sorry AWS
CLI commands so what did I tell you in
the previous class don't worry if you
don't know the AWS CLI commands what you
can simply do is you can
go for the AWS CLI
reference
okay so there is a wonderful
documentation here and AWS keeps it very
simple and neat let's say you want to
learn about a specific command so in my
case I am going to start with S3 so what
I'll do is I'll come here I'll go for
the S3 option and what I want to do is I
want to just list the buckets right so
search for an option called list
uh here okay so there is a command
called LS I found out that there is a LS
command so what you can do is use let me
increase the font so that everybody
finds it clear okay so it says that you
can use the AWS S3 LS and it will list
you all the S3 buckets so let me put
that here so firstly I'll say AWS S3 LS
and before every command definitely put
the comments why you have to put the
comments because somebody who is reading
the script and don't have the scripting
knowledge should understand or even
somebody has a scripting knowledge but
even by looking at the comments they can
understand what is the code list S3
buckets
don't worry I'll also explain you about
uh improvising the script by using some
set arguments uh how can you run this
script in a debug mode how can you uh
like you know avoid some pipe issues and
certain kind of things uh again we'll
improvise the script in a step-by-step
manner so I got this here then
what is the next thing uh list
ec2 instances again let's say that you
are new to AWS CLI and you don't know uh
how to use this command so what you will
do go back to the references here and
just a page back because you want to
find ec2 not S3 so what I'll do is I'll
search for ec2 so this is ec2 and in ec2
AWS CLI will show you the bunch of
commands that are available here and I
know personally that using describe
instances you can get this information
so what I'll do is I'll directly jump to
describe instance
that can give you the information of
list of AWS resources easy to resources
so what I'll do here I'll use this
command AWS ec2
describe instances if you don't know
this command what you will do you will
go for the AWS CLI you will read the
dock and you will understand
and after that what I am going to do
here I want to list the AWS Lambda
functions so to list the AWS Lambda
functions what I will do is list AWS
Lambda is something I will call I'll put
in the comment section
and here I'll say AWS
Lambda
list functions so this is a command I
know the command if you don't know the
command you can refer the CLA reference
then finally list
IAM users
AWS
IAM
users I think this is a command but uh
even I am not sure so what I'll do I'll
go back and I'll try to look for the
reference
so IAM yes and in the IAM how should
I list the user so let me search for
list okay so list groups is available
and then list users okay so the command
is not just users but
list hyphen users perfect so now let me
initially run this script and see how we
are getting the output
I know this is not the end script I'll
improvise it but just to stop here and
show you the output chmod let's give
the permissions for now let's keep it
777 uh usually it's not a good practice
to keep it 777 but just uh we can keep
it randomly because we are not
publishing the script anywhere for now
and now I'll say dot slash to execute
the script followed by AWS resource
tracker.sh and let's see what is the
output
so it gave me the output here if you see
okay let me uh take this into the uh
file mode so that you will understand it
better
and I'll redirect the output into a file
mode that is using pipe more ('| more') so using
pipe more you can easily read it in a
better way
so the first one here is the information
about the ec2 instances before that we
had the S3 buckets information so this
is this is giving the list of all the
ec2 instances that are available in my
account and then if you go down it also
gives you the list of IAM users but
wait here you are not able to understand
which output is uh which one right
because the problem here is you haven't
used any print statements so for that
what we will do is we will modify this
script and we will add certain print
statements okay so using the print
statements you can understand
a user can get a debug information so
Echo
list of
S3 buckets okay and similarly what
you can do is let me copy this from here
paste it here and uh I'll say print list
of ec2
print list of
Lambda function so what are we doing
here we are improving the user
experience and I'll also show you one
more effective way of doing it print
list of IAM users and then there is a
very important function let's say you
are looking at some shell scripts that
are written by your peers or your
seniors in the organization they often
use something like this set plus X or
set minus E right so why are these
things used because this will put your
script into a debug mode so what happens
is whenever you are running any script
it will show you that okay this
particular command is getting executed
and this is the output so let me also
show you what happens if I enable set
plus X
so let me run this here
so it said oh sorry my bad so it it's
not the one that I said it should be
okay set minus X
very sorry for that yeah so uh let me
run this one now and see what happens
see so it it told you that it is running
AWS IAM list-users so this is the command
that you have used and uh you know it
said that you are using AWS
so it is printing in the commands that
you used right so it said AWS Lambda
list functions so what is it doing echo
print the list of IAM users so there is no
IAM users in my account so that's why
there is not oh sorry so this is the
list of IAM users so I got the output
here whereas there are no Lambda
functions so I did not get any output so
this way you can
get all the information so firstly echo
print list of S3 buckets this is the
command that we have used AWS S3 LS and
this is the output
print list of S3 buckets and there is
only one S3 bucket that is penguin
flew away so I just created that one and
then there is describe instances what is
it doing it is printing list of
ec2 instances and this is the output and
finally you have the list of IAM users so
this way you can create a report but
again there is a problem here right so
this report is generating the outputs
but it is not uh quite clear to me so
what I thought is I'll only give the AWS
instance IDs because this is a bunch of
information right so if I take all this
information it will not be clear for my
manager what is the instance ID because
the AWS describe instance command is
giving me a lot of information but I
just need AWS instance ID so for that
there is a very special command in Linux
that is called jq because this is a JSON
that uh your AWS is returning what we
can do is we can read through the JSON
we can read through the JSON using
jq and we can just get the instance ID
so I'll show you how to do that now the
output will become much simpler so for
example this is the command that we were
using right AWS
ec2
describe hyphen instances
and if you look at the output output is
really big but I just want the
information that I need that is the
instance ID so for that what I'll do is
I'll use the pipe to send the output of
this command to JQ
okay what does jq do jq will just get
it's like a JSON parser to get
the information from JSON similarly like
jq you also have yq yq is used as a
YAML parser jq is used as a JSON parser
so uh always as DevOps engineers we
deal with JSONs and YAMLs so get
familiar with jq and yq so I'll show you
how to use jq as well and then in single
quotes you have to just provide
information of what are the things that
you are trying to uh like for example I
am trying to get the instance ID but if
I look at the level of instance ID it is
inside instances and it is inside the
reservations so for that what I have to
do is Dot
reservations so now it will only give
the reservations information that is
from here but again I want only
instances okay so for that what I'll do
is Dot instances
so I am using these brackets because
this is a list right instance is not a
single instance if it was a single
instance you can just use this but it is
a list so I am using these brackets dot
instance ID now it will give it will
give me any number of instance IDs that
are available okay so if you see here
this is just the instance ID that I have
now I can replace this command in my
script and I can simplify my script
because my script earlier was very big
and I mean script was very simple but
the output was very big but uh anybody
who is looking at that they might not be
interested so I can simplify this using
the command that I just created
now let me run the script and see what
happens
okay so if we rerun the script what will
happen is it will give me a
better user experience to command
perfect so if you see here no this this
is just the output uh the output started
from here it said Echo print list of S3
buckets it gave me the command that I
used and this is the output here and
then uh print list of ec2 buckets this
is the command that I've used the JQ
command and this is the output perfect
and then this is the list of Lambda
functions this is the command I've used
and this is the output list of IM users
this is a command that I've used and
this is the output perfect now what you
will do here is just integrate this
script to a cron tab and your output
is done because okay let me also show
you a thing that how to put this
information into a file
so for doing that it's very simple what
you can do is every output you can
redirect it to a file how do you
redirect into a file just improvise the
script and say uh like you know
just output to a file called
resource tracker every output
you can just redirect to that file using
this command here
so that every information goes to
Resource tracker and somebody can just
open the resource tracker and they can
see now what I'll do is I'll definitely
stop here so that I can give you an
assignment and write the same script and
integrate it with cron tab if you have
any questions I can explain you in the
next video but it is very simple right I
already explained you the script just
integrate it with cron tab and in the
future video we will see a advanced
shell script project this is a very
simple one so I hope you like this video
if you have any questions please post in
the comment section and don't forget to
like this video and share with your
friends thank you so much