Practical Projects to Learn AWS
Summary
TLDR: This video offers a practical introduction to AWS by presenting four projects that help you learn and master the cloud platform. It starts with hosting a static website on S3 and Route 53, walks through building a CRUD application with ECS, ECR, and RDS, and shows how to implement data processing with Kinesis Firehose, S3, Lambda, and Elasticsearch. Finally, it explains a serverless workflow example for detecting stock price movements using CloudWatch, SQS, DynamoDB, and SES. Each project is ideal for internalizing AWS concepts and is highly useful for beginners and advanced learners alike.
Takeaways
- 😀 The video is primarily about learning AWS and how to get started with it.
- 🔍 It recommends starting with the AWS Cloud Practitioner course to get familiar with the various services and concepts.
- 🛠 To gain hands-on knowledge, it suggests working through several projects to become comfortable with AWS.
- 🌐 The first project is hosting a static website with S3 and Route 53.
- 📝 Another practical exercise is building a CRUD application with Docker containers and ECS.
- 💧 A third project deals with consuming Twitter streams and processing tweets with Kinesis Firehose and Elasticsearch.
- 📈 The fourth project is a distributed serverless workflow for stock price movements.
- 📊 The stock-watching project uses CloudWatch Events, SQS, Lambda, DynamoDB, and DynamoDB Streams.
- 📱 A notification is sent when significant price changes in stocks are detected.
- 🎥 Videos on each of the projects mentioned will be created to support the learning process.
Q & A
Where should a beginner start learning AWS?
-Start with the AWS Cloud Practitioner course offered by AWS to get familiar with the various services and concepts.
How can you learn AWS in a hands-on way?
-By working through projects based on real use cases, such as hosting a static website on S3, building a CRUD application with ECS and RDS, or processing data with Kinesis Firehose and Elasticsearch.
What is the advantage of using CloudFront instead of accessing S3 directly?
-CloudFront is an edge-caching service that is very cost-effective for resources that don't change often. It distributes files to edge nodes around the world to ensure optimal performance.
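Wiring CloudFront in front of the S3 website endpoint can be sketched as a `boto3 create_distribution` payload. This is a minimal, illustrative sketch only — the bucket name, region, and caller reference are hypothetical, and a real distribution would also need an ACM certificate for a custom domain:

```python
def cloudfront_s3_origin_config(bucket, region, caller_ref="static-site-1"):
    """Minimal CloudFront DistributionConfig fronting an S3 static-website
    endpoint. Names here are illustrative, not from the video."""
    # S3 *website* endpoints are treated as plain-HTTP custom origins
    origin_domain = f"{bucket}.s3-website-{region}.amazonaws.com"
    return {
        "CallerReference": caller_ref,
        "Comment": "Edge cache for the static site",
        "Enabled": True,
        "Origins": {"Quantity": 1, "Items": [{
            "Id": "s3-website-origin",
            "DomainName": origin_domain,
            "CustomOriginConfig": {"HTTPPort": 80, "HTTPSPort": 443,
                                   "OriginProtocolPolicy": "http-only"},
        }]},
        "DefaultCacheBehavior": {
            "TargetOriginId": "s3-website-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            "ForwardedValues": {"QueryString": False,
                                "Cookies": {"Forward": "none"}},
            "MinTTL": 0,
        },
    }

# To actually create it (requires AWS credentials):
#   import boto3
#   boto3.client("cloudfront").create_distribution(
#       DistributionConfig=cloudfront_s3_origin_config("my-bucket", "us-east-1"))
```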
What is Route 53 and how is it used in AWS?
-Route 53 is a DNS web service used to edit the DNS settings for your AWS account, so you can map a custom domain to an index.html file or a load balancer.
How does building a CRUD application with Docker containers work in AWS?
-You write a Dockerfile, push the image to the Elastic Container Registry (ECR), and configure an ECS service to point at the ECR image. The ECS cluster is launched inside a VPC, and you set up an RDS database to back the application.
What is Kinesis Firehose and how is it used in AWS?
-Kinesis Firehose is a service for batching and data delivery that deposits data into S3 at regular intervals. It is easy to set up and offers many different parameters that can be tuned.
What role does Elasticsearch play in AWS and how is it used in the projects?
-Elasticsearch is used for text processing and querying and is often the technology behind autocomplete features. In AWS it is used to index data from S3 and analyze it through Kibana.
How does the serverless workflow for stock price movements work in AWS?
-It uses CloudWatch Events to send a message to SQS every minute, which in turn triggers a Lambda function that fetches the stock price from the Yahoo Finance API and indexes it in DynamoDB. If the price changes significantly, a notification is sent via the Simple Email Service.
What are the advantages of using Fargate instead of EC2 for building a serverless application?
-Fargate is a serverless way to run containers without having to manage servers or infrastructure. It is easier to use and does not require you to manage security group or VPC networking settings yourself.
What resources does AWS offer for getting started, and what are the next steps if you want to go deeper?
-AWS offers many resources, such as the Cloud Practitioner course. If you want to go deeper, you can dive into projects that use various AWS services such as S3, ECR, ECS, RDS, Kinesis Firehose, Elasticsearch, DynamoDB, and Lambda.
Outlines
🌐 Getting Started with AWS: Hosting a Static Website
This section of the video focuses on introducing AWS and how to get started with it. The uploader recommends beginning with the AWS Cloud Practitioner course to get familiar with the various services and concepts. Once you have a basic understanding of the concepts, the video shows how to host a simple website by uploading an index.html file to S3. S3 serves as an all-purpose data store for raw objects of various sizes and types. It then explains how to connect the website to Route 53 and CloudFront to get a friendlier URL and to optimize performance by distributing the files to edge nodes worldwide.
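The upload-and-enable-hosting steps can be sketched with boto3. This is a sketch under assumptions: the bucket name and region are hypothetical, the bucket must already exist and allow public reads, and the deploy function needs AWS credentials to run; the pure helpers above it only build configuration:

```python
def website_config(index_doc="index.html", error_doc="error.html"):
    """S3 static-website configuration payload."""
    return {"IndexDocument": {"Suffix": index_doc},
            "ErrorDocument": {"Key": error_doc}}

def website_endpoint(bucket, region):
    """The 'ugly URL' the video mentions: the raw S3 website endpoint."""
    return f"http://{bucket}.s3-website-{region}.amazonaws.com"

def deploy_static_site(bucket, region, html_path="index.html"):
    """Upload the page and turn on website hosting.
    Requires AWS credentials and an existing, publicly readable bucket."""
    import boto3  # imported lazily so the helpers above stay testable offline
    s3 = boto3.client("s3", region_name=region)
    s3.upload_file(html_path, bucket, "index.html",
                   ExtraArgs={"ContentType": "text/html"})
    s3.put_bucket_website(Bucket=bucket,
                          WebsiteConfiguration=website_config())
    return website_endpoint(bucket, region)
```

From here, a Route 53 alias record (or a CNAME) would point your custom domain at this endpoint or at a CloudFront distribution in front of it.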
🐳 CRUD Application with Docker and AWS ECS
This section demonstrates a practical approach to building a CRUD (Create, Read, Update, Delete) application with Docker containers and AWS Elastic Container Service (ECS). The process starts with writing a Dockerfile and pushing the image to the Elastic Container Registry (ECR). An ECS service is then set up that points at the ECR image. The ECS cluster is launched inside a Virtual Private Cloud (VPC) to ensure isolation and security. In addition, an RDS database is set up and application code is written that interacts with the RDS instance. A load balancer distributes traffic across multiple containers, and finally a Route 53 DNS entry is created to map the domain to the load balancer.
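The ECR-to-ECS wiring can be sketched as the two artifacts you configure: the ECR image URI the service points at, and a Fargate task definition. The account ID, repository name, and the `DB_HOST` environment variable are hypothetical placeholders — a real app would read its RDS endpoint from configuration you control:

```python
def ecr_image_uri(account_id, region, repo, tag="latest"):
    """Fully qualified ECR image URI that the ECS service references."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repo}:{tag}"

def fargate_task_definition(family, image, container_port=8080):
    """Minimal Fargate task definition for register_task_definition.
    Sizes and the DB_HOST value are illustrative, not from the video."""
    return {
        "family": family,
        "requiresCompatibilities": ["FARGATE"],
        "networkMode": "awsvpc",  # Fargate tasks get their own ENI in your VPC
        "cpu": "256",
        "memory": "512",
        "containerDefinitions": [{
            "name": family,
            "image": image,
            "portMappings": [{"containerPort": container_port,
                              "protocol": "tcp"}],
            "environment": [
                # hypothetical: app reads its RDS endpoint from the environment
                {"name": "DB_HOST",
                 "value": "my-db.example.rds.amazonaws.com"},
            ],
        }],
    }

# To register it (requires AWS credentials):
#   import boto3
#   image = ecr_image_uri("123456789012", "us-east-1", "crud-app")
#   boto3.client("ecs").register_task_definition(
#       **fargate_task_definition("crud-app", image))
```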
🔥 Data Processing with Twitter Streams and AWS Kinesis Firehose
This part of the video presents a project for processing Twitter streams. It explains how to write a Python script that consumes data from Twitter's live stream. That data is then sent to a Kinesis Firehose endpoint, which is designed for batching and processing data. Kinesis Firehose collects data and flushes it to an S3 bucket every five minutes or every five megabytes, whichever comes first. When a file lands in S3, a Lambda function is triggered that fetches the file contents and indexes them in Elasticsearch, which is useful for analyzing text data. Elasticsearch ships with Kibana, a user interface that makes it possible to visualize trends and analytics over time.
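The Lambda end of this pipeline can be sketched as follows. The S3 put-notification parsing is the standard event shape; the fetch-and-index part is only hinted at in comments, since the Elasticsearch domain name would be your own and running it requires AWS credentials:

```python
def extract_s3_objects(event):
    """Pull (bucket, key) pairs out of an S3 put-notification event,
    i.e. the location and name of each file Firehose just delivered."""
    return [(r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
            for r in event.get("Records", [])]

def handler(event, context=None):
    """Lambda entry point: fetch each batched tweet file from S3 and
    index its contents. Sketch only — the Elasticsearch endpoint and
    index name would be placeholders you replace with your own."""
    for bucket, key in extract_s3_objects(event):
        import boto3  # lazy: needs AWS credentials to actually run
        body = boto3.client("s3").get_object(
            Bucket=bucket, Key=key)["Body"].read()
        # then index each tweet as a document, e.g. POST each line to
        # https://<your-es-domain>/tweets/_doc — client code omitted
```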
📈 Distributed Serverless Workflow for Stock Price Movements
The fourth and final example section of the video covers building a distributed serverless workflow for detecting rapid jumps or drops in a stock's price. CloudWatch Events is used to set up one-minute intervals that send a message to the Simple Queue Service (SQS). That message wakes a Lambda function, which fetches the current stock price from the Yahoo Finance API and indexes the data in a DynamoDB table. Using DynamoDB Streams and a second Lambda function, the workflow checks for significant price changes, and if needed a notification is sent via the Simple Email Service to inform the user about significant stock price movements.
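The comparison logic in the second Lambda can be sketched with the video's own numbers ($8 at minute one, $10 at minute two). The threshold value and the record field names are assumptions for illustration; the SES call itself is left as a comment since it requires verified sender/recipient addresses:

```python
def percent_change(prev, curr):
    """Minute-over-minute price move, in percent."""
    return (curr - prev) / prev * 100.0

def is_significant(prev, curr, threshold_pct=5.0):
    """True when the move (up or down) exceeds the threshold.
    The 5% default is an illustrative assumption."""
    return abs(percent_change(prev, curr)) >= threshold_pct

def on_dynamodb_stream(prev_row, curr_row):
    """Sketch of the stream-triggered Lambda: given the row just inserted
    (t2) and the row for the minute before it (t1, re-queried from
    DynamoDB), decide whether to alert via SES."""
    if is_significant(prev_row["price"], curr_row["price"]):
        # boto3.client("ses").send_email(...) — needs verified addresses
        return "notify"
    return "ignore"
```

With the video's example, the $8 to $10 jump is a 25% move, well past any reasonable threshold, so the SES notification fires.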
Keywords
💡AWS
💡Cloud Practitioner Course
💡S3
💡Route 53
💡CloudFront
💡ECS
💡ECR
💡VPC
💡RDS
💡Lambda
💡Kinesis Firehose
💡Elasticsearch
💡DynamoDB
💡SQS
💡CloudWatch
Highlights
The recommended course for starting to learn AWS is the AWS Cloud Practitioner Course.
The first step in learning AWS hands-on is getting familiar with the different services and concepts.
Hosting a static website via S3 is a common starting point for learning AWS.
Route 53 is used to set up a custom domain and map it to the S3-hosted static website.
CloudFront, as an edge-caching service, can improve website performance.
Deploy a CRUD application using ECS and Docker containers.
ECR is used to store and manage Docker images.
A VPC provides resource isolation, similar to a data center in the cloud.
RDS is used to set up and manage databases such as MySQL.
Use a load balancer to distribute load across multiple containers.
Process data from the Twitter stream with Lambda functions.
Kinesis Firehose is used for batching and stream processing.
Elasticsearch is used for text processing and querying.
Kibana provides a visualization interface for Elasticsearch data.
Build a distributed serverless workflow to monitor stock price movements.
CloudWatch Events is used to set up scheduled triggers.
SQS is used to handle back pressure in the application.
Use DynamoDB Streams to detect database change events.
Send stock price movement notifications via the Simple Email Service.
These projects are practical examples for learning AWS, covering a wide range of technologies.
Transcripts
what is going on guys this video is
going to be all about learning aws
uh by far the most common question that
i get asked is where do you get started
learning in aws
what are some projects that you can do
to just get yourself familiar with it
and usually what i do is tell people to
start with aws cloud practitioner course
which is a great resource that's offered
by aws
so you can get familiar with some of the
different services some of the different
concepts that you need to know
in order to get started now after you
get familiar with some of the concepts
the next question is how do i actually
learn this stuff in a more practical way
because that course teaches you a lot of
the different things but it doesn't
really tell you
about what are things that people are
actually working on what are skills
or or patterns that people are using or
technologies that people are using
that are useful for me to get a job and
that's what this video is going to be
about it's going to be about
a variety of different projects that you
can learn from a very practical
perspective of what people are actually
doing with aws
in real life uh so that if you're
looking for a job or you're just trying
to hone some of your skills
uh this video is gonna be for you and
the good news is that i'm gonna be
making videos on each of these different
projects i'm gonna put in front of you
today
uh so that if you try it and you get
stuck don't worry i'm gonna show you how
to do it later
okay so that's what this video is going
to be about so let's start with it by
looking at our first
project and we're going to start simple
and progress a little bit
to more advanced topics as we go along
so the first one
is static website hosting this is
probably the most common thing that you
see when people are suggesting
uh learning aws it's just start with a
basic website and host it on s3 so let's
take a look at how this may work if
you're trying to set up this project
so the first thing that you do is you
upload a index.html file to s3
and just as a reminder s3 is your
all-purpose data store for storing
raw objects of a variety of different
sizes and types
for this purpose this would just be an
index dot html file that contains some
just raw javascript you can also upload
upload your
css or javascript assets as well from
there
we get an ugly url after we do that
although this index.html is publicly
accessible the url will be very ugly
from there you want to hook this up to
route 53 and route 53 is where you would
edit your dns settings
for your aws account so you can set up a
domain that maps your custom domain
to your index.html file so you can get
something like
be a betterdev.com and that'll point to
this index.html file now you can stop
there if you want so if a user is
accessing your website or trying to
access your website through chrome
they would just type in your website
name it would go to route 53 that would
resolve to
the s3 file and that's totally fine this
will completely work
i would suggest though instead of
hooking it up directly you use
cloudfront which is a
edge caching service that is very cost
effective
for resources that change uh not so
often so
instead of going directly to s3 you tell
route 53 to go to cloudfront instead
and cloudfront would then synchronize
its data with s3
and it'll deploy your files to different
edge nodes that are located
all across the world for optimal
performance
so this is a very simple thing if you
already know how to do this maybe you
can skip over it but for beginners this
is
probably the most common example of
things that
uh folks tell beginners to learn is just
get familiar with the basic concepts
and the good news is that if you choose
to learn these different things
cloudfront s3 and route 53 that's something
you just need to know if you're going to
be working with aws
so these are skills that are going to
apply outside of this example
all right so let's move on to the next
one here and this is a
crud app and by crud i mean create read
update and delete so let me just erase
all this and reveal what we have okay so
we're going to be using
a docker containers for this so
optionally you can do this with ec2 but
i'm going to do it using ecs which
stands for elastic container service
and this allows you to upload docker
files to the cloud and deploy them using
a managed service called ecs
so how this works from a practical
perspective some of the things that
you're going to need to do
are first you're going to need to write
your docker file then you're going to
upload
the image to ecr which stands for
elastic container registry
and think of ecr kind of like a s3
that's dedicated for docker images
that's essentially what it is
then when you set up your ecs service
what you're going to do is point a point
it to an ecr
image and that's going to be the image
that's used for the service when you set
up your ecs cluster as well you're going
to have to put it within a vpc
and a vpc stands for virtual private
cloud it's essentially a way in which
your resources are isolated from other
aws customers think of it like
a cloud-based data center essentially
that's kind of what it is and you're
shielded from what other customers are
doing no one's able to access your
network
you're able to lock down all the ports
and everything to do
with networking within this application
so you're going to have to launch your
ecs cluster into a vpc
you're going to want to set up a rds
database which in this example
is just using aws rds with mysql which
is very easy to set up
i have a video on setting this up as
well and on ecs actually i actually have
a video on a bunch of these different
things i'll put links in the description
section below
and then you're going to want to code up
your application code so that it
interacts with this
rds instance from there you're going to
want to set up a load balancer
so that you can distribute load across
multiple different containers that
you're hosting and so you would point
your load balancer to your ecs
endpoint and then you again you'd create
a route 53 dns
entry to map your domain name to your
load balancer
and then finally when someone comes
along and tries to hit your website of
course it'll go
to route53 first it'll hit the load
balancer the load balancer will
distribute the request to one of the
containers on ecs
and then you know it'll go to rds and
retrieve your data create your data
update it whatever you're trying to do
and return that all the way back to the
caller so i would consider this to be a
very good example because this is what
people are usually doing these days
serverless really seems to take over
if you're using ecs you can you have two
options running in a serverless way
using fargate
or running it in a non-serverless way
using ec2 i would suggest fargate for
this example just to get familiar
the ec2 one is a little bit more
complicated requires some different
permissions to change uh security group
settings a whole bunch of vpc networking
things that you'll need to know about
there
but i think fargate is a good way to
start again this is
a very useful example a lot of
applications these days are built
using a very similar framework than what
i just showed you here
so definitely definitely definitely
invest the time in learning
through this example okay so let's move
on to the third one which is going to be
more in terms of data processing which
is an interesting one
so i had fun putting this one together
and this is going to be fun when i
actually build this thing
because it's kind of interesting so
let's talk about what this is and this
is going to be aimed at
consuming twitter streams or consuming
tweets i should say rather so twitter
has an endpoint where you can read off
of
live streaming tweets as they come in
and all it requires is setting up a
developer account and getting an api key
so what you would do is just write a
python script that's going to consume
data from that stream and it can be
consuming that data
constantly or on a timer or something
you may choose
and it's going to consume individual
tweets there and then
what we're going to do there to process
this data is we're going to
put the tweets to a kinesis fire hose
endpoint i have a
video on kinesis firehose that you
should check out it's great for batching
and data processing
so what kinesis firehose is going to do
is we're going to set it up so that
every five megabytes worth of data or
every five minutes
whichever comes first we're going to
take all that data and put it into one
file and dump that into s3
so kinesis fire hose by the way it does
all of this automatically like doing
these things is just clicking buttons
it's very very easy
to actually set this stuff up and
there's a whole bunch of different
parameters that you can set as well
so every five megabytes or five minutes
we're going to deliver a file
to s3 we're going to wire up our s3
bucket
that we allocate for this data to
a put notification and that put
notification is going to trigger
a lambda function so this lambda
function is going to be invoked
every time a file is uploaded to s3 now
the
payload of the notification or what's
inside the the notification is just
going to be the location
and the name of the file that was
created so it'll be
like index dot csv
or data.csv or something like that from
there we need to code our lambda such
that
it goes back to s3 and actually pulls
the entire
file contents back and then from there
we can index it in elasticsearch
now if you've never used elasticsearch
text processing is basically what it's
used for a text
kind of querying it's used in a lot of
um
autocomplete databases or you know when
you're searching on google and there's
auto complete you type a character
something comes up elasticsearch is
usually the technology behind the scenes
or something like it
to give you that kind of functionality
so in this case it would be great for
analyzing text so we would take the
entire file contents upload that into
elasticsearch so that we can analyze it
and the cool thing about elasticsearch
on aws is that it comes pre-installed
with kibana
and kibana is like a ui
for this stuff and it'll show you kind
of trends over time
and uh you can also do things like word
clouds or
or different types of queries by
different grouping mechanisms so it's a
very intuitive and easy way to observe
and analyze your data through the
comfort of a ui and all of that data is
going to be sitting inside elasticsearch
so that's how we would access it
so i think this is a pretty good example
that allows you to touch on a lot of
different technologies for the purpose
of data processing
so i'd highly encourage you to check
this one out let's move on to
the fourth and final example now and
yeah so this one's
kind of fascinating i just came up with
this a few
hours ago and this is a distributed
serverless workflow for
stock price movements and the motivation
for this one
is i'm sure some of you were following
gme or gamestop
and uh you wanted to know when it was
rising or falling very rapidly you
couldn't sit at your computer all day
just looking at a stock price uh so the
the idea here is to set up a
distributed uh serverless workflow that
will automatically detect when
jumps or drops occur in very short
intervals
in a stock price okay so i think it's
pretty interesting given what's happened
in the world pretty recently
so what are we going to be using here so
there's a bunch of different pieces
and let me just center this on the
screen so you don't get distracted
okay so first of all we're going to be
using cloudwatch events
and we're going to set them up so that a
cloudwatch event
triggers on one minute timers think of
these things like cloud-based
cron jobs and i have a video on these by
the way that you can learn more
but essentially you set a timer based on
some interval
or some day of the week cadence in this
case we'll just do it every minute
and from there this is going to be kind
of our our way to trigger
our workflow so it's going to be
happening every minute so we're going to
set up cloud watch so that it
sends a notification to sqs and sqs
which stands for simple queue service
it's a great way to back pressure or
have back pressure
for processing in your application so
it's going to signify that there's a job
to be done
we're going to hook up our lambda
function to sqs
so that anytime a message gets put in
sqs our lambda function is going to wake
up and perform whatever it's going to do
now what is it going to do well it's
going to go to
yahoo finance api which has a great api
that you can access
if you're interested in a price for a
particular ticker we can use i know i
said gme so let's just keep on going
uh with that example here so we're going
to query for gme it's going to give us
back things like stock price
number of shares that are available all
that kind of stuff we'll consume that
data and then we're going to
index it in a dynamodb database so this
is going to be a completely serverless
example there's no servers here involved
at all which is
kind of a neat thing and where the
industry seems to be going as a whole
so we're going to index that data in
dynamodb for every
tick basically every one minute we're
going to have a different row
that says the stock price was this at
this time so if we think of like a table
like this
uh if we have t1 for instant one and
then
t2 for instant two or maybe this is
minute one minute two and this is
our price uh maybe we would have
something like eight dollars
in at the first minute and then ten
dollars
in the second minute so you can see here
the jump between these two minutes
is very high it's a two dollar
difference and it and if you convert
this to percent it's a pretty large
percent
so we would want to send out a
notification or somehow become aware
when this happens so the question is how
do we do that
well the way i want to do it just for
the purpose of teaching you more things
in this example and this isn't
necessarily the best way to do it but it
teaches you different concepts
is i would hook up this dynamodb table
using dynamodb
streams and dynamodb streams allow you
to detect
change events so when we wire this up to
our lambda function
every time a record gets inserted in the
database say t2 gets inserted into the
database in this example
we're going to get a notification sent
to our lambda function
and it's going to show us the entire
contents of the record so
all this data that exists in the row so
we're going to take that data
then we're going to query again we're
going to go back to dynamo
but we're going to look for t1 which is
the minute before it
and then compute this difference and if
the difference is significant enough
maybe beyond a certain percentage
we're gonna send a notification through
simple email service
to our phones so we get a text message
or an email to say hey daniel the stock
price dropped or the stock price raised
by x amount go and buy or sell some
shares
so i thought this was a pretty
fascinating example so i think these are
four examples that are great for
learning aws from a very
practical perspective all these things
are things that you would probably have
to do at some point in your career maybe
not interact with stock prices or the
twitter api
but the technologies that you're going
to be using are very
interchangeable with different use cases
so i'd highly suggest you try some of
these projects
out and if you get lost don't worry like
i said i'm going to have videos coming
out
on each of these different projects and
i'm going to be doing them and walking
you through how to do them
so look forward to that as well if you
like the video don't forget to like and
subscribe and i'll see you next time
thanks so much guys