What is Edge Computing for Data & AI, and Should You Be Interested?
Summary
TL;DR: Ben Dunmeyer, a data analytics consultant at Thorogood, discusses the impact of Edge Computing on data and AI. He explains the concept of processing data near its source to reduce latency, enhancing real-time decision-making. Dunmeyer explores use cases like retail inventory tracking, workplace safety, and healthcare monitoring. He also delves into the practical challenges of implementing Edge Computing, such as hardware requirements, security, and aligning with existing data strategies. The talk concludes with thoughts on the evolving landscape of providers and the importance of integrating Edge Computing into long-term data strategies.
Takeaways
- Edge Computing brings computation and storage capabilities closer to where data is generated, reducing reliance on central data hubs.
- It enables low-latency or real-time data processing and analysis, which is crucial for applications requiring instantaneous decision-making.
- The technology is particularly impactful in industries like manufacturing, healthcare, and retail, where real-time data can optimize operations and enhance safety.
- Efficient use of physical assets is achieved by utilizing edge devices to perform computations and analyses during idle times, maximizing ROI on infrastructure.
- The ability to combine new and diverse datasets collected through edge devices can lead to novel insights and improved decision-making across various sectors.
- Edge Computing allows for the filtering and prioritization of information, reducing the need to transfer all data back to a central hub and thus optimizing network usage.
- It can operate effectively in environments with limited or unreliable network connectivity, making it suitable for remote locations or situations where network infrastructure is a constraint.
- Security and data sovereignty become critical considerations as sensitive data and intellectual property may be stored and processed on edge devices.
- The technology's potential is heavily influenced by the capabilities of cloud platforms, which provide the infrastructure and tools necessary for edge computing implementations.
- Successful integration of edge computing requires a strategic approach that considers technical capabilities, business objectives, and the physical realities of deploying and maintaining edge devices.
Q & A
What is the main focus of the discussion in the transcript?
-The main focus of the discussion is Edge Computing and its implications on data and AI, particularly looking at its potential impact on the data and AI landscape.
What is Edge Computing?
-Edge Computing refers to the ability to have both compute and storage within devices or where data is being generated, as opposed to relying solely on a central data hub or infrastructure.
What is the significance of low latency in Edge Computing?
-Low latency in Edge Computing is significant because it allows for real-time or near real-time data processing and analysis, which is crucial for applications that require instantaneous decision-making.
How does Edge Computing relate to AI applications?
-Edge Computing can enable AI applications by providing the necessary computation and storage capabilities close to where data is generated, allowing for faster and more efficient AI model training and inference.
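To make the inference side of that answer concrete, here is a minimal sketch of how a model trained centrally might score telemetry locally on an edge device, with no round trip to a central hub. The sensor names, weights, and threshold below are illustrative assumptions, not anything specified in the talk.

```python
import math

# Hypothetical pretrained logistic-regression weights, trained centrally
# and pushed down to the edge device (names and values are illustrative).
WEIGHTS = {"temperature_c": 0.08, "vibration_mm_s": 0.9}
BIAS = -7.0

def fault_probability(reading: dict) -> float:
    """Score one telemetry reading locally on the device."""
    z = BIAS + sum(WEIGHTS[k] * reading[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def decide(reading: dict, threshold: float = 0.5) -> str:
    """Instantaneous local decision: alert the operator or carry on."""
    return "alert" if fault_probability(reading) >= threshold else "ok"
```

Because both the weights and the scoring function live on the device, the decision latency is bounded by local compute rather than by network hops to a central data store.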
What is the role of Thorogood in the context of this discussion?
-Thorogood is an independent Global Professional Services firm specializing in business intelligence and analytics strategies, solutions, and services. They offer a full range of data analytics services and work with various technologies to provide clients with solutions that best suit their needs.
What are some potential use cases for Edge Computing mentioned in the transcript?
-Some potential use cases for Edge Computing mentioned include real-time decision making, efficient usage of physical assets, combining new and diverse data sets, filtering information, and reducing dependency on environmental or physical limitations.
How does Edge Computing impact traditional business intelligence reporting?
-Edge Computing can impact traditional business intelligence reporting by allowing for real-time or near real-time analytics, which can provide more up-to-date insights and potentially lead to better decision-making.
What are the challenges associated with implementing Edge Computing?
-Challenges associated with implementing Edge Computing include physical practicalities such as procuring, installing, and maintaining the equipment, ensuring connectivity, addressing security concerns, and considering data sovereignty.
How does Edge Computing fit into an organization's overall data and AI strategy?
-Edge Computing should be considered as a facet of an organization's overall data and AI strategy, where it can be opportunistic for specific use cases or part of a wider rollout. It's important to align it with business and technical strategies and consider how it will work with existing systems.
What are some of the core capabilities needed to operate and manage Edge Computing?
-Core capabilities needed to operate and manage Edge Computing include operating and running data systems, DevOps, data engineering, data science, machine learning, data collection, monitoring, and potentially MLOps (machine learning operations).
What might the landscape of providers for Edge Computing look like in the future?
-The landscape of providers for Edge Computing may include the mainstays of the data and AI space, new players, conglomerates, or partnerships. It will be interesting to see how the hardware requirements and market potential shape the competition.
Outlines
Introduction to Edge Computing and Its Impact on Data and AI
The speaker, Ben Dunmeyer, a data analytics consultant at Thorogood, introduces the topic of Edge Computing and its potential implications on data and AI. He outlines the agenda, which includes sharing insights into Edge Computing, its current implementations, and its future impact on data and AI landscapes. Ben emphasizes the importance of understanding Edge Computing, particularly in the context of low latency data processing and analysis, and how it can enable real-time decision making and efficient use of physical assets. The discussion also touches on the need for filtering and prioritizing data to optimize business intelligence and reporting.
Edge Computing in Practice: Real-World Applications and Opportunities
This section delves into practical applications of Edge Computing, such as enhancing workplace safety, optimizing retail operations through inventory tracking and promotion analysis, and improving manufacturing processes. The speaker highlights the ability of Edge Computing to collect new and diverse data sets, which can lead to fresh insights and improved decision-making. It also addresses the potential to reduce dependency on environmental factors, such as limited network connectivity in remote locations, by processing data locally. The paragraph provides various examples to illustrate the versatility of Edge Computing across different industries.
Emerging Use Cases of Edge Computing in Healthcare, Transportation, and More
The speaker explores emerging use cases for Edge Computing, including patient monitoring in healthcare, traffic management, and autonomous vehicles in transportation. These examples showcase the potential for Edge Computing to enhance real-time responses and decision-making in various sectors. The discussion also considers the future growth of Edge Computing as technology advances and the volume of capturable data increases. The speaker invites the audience to consider how these use cases might apply to their own organizations or industries and to think about the potential long-term strategy involving Edge Computing.
Implementing Edge Computing: Technical and Strategic Considerations
In this part, the speaker discusses the implementation of Edge Computing, focusing on integrating edge devices with existing data and AI architectures. He uses a manufacturing plant as an example to illustrate how data collected by edge devices can be used to monitor equipment, identify potential faults, and improve quality control. The speaker emphasizes the need for a robust data platform to collect, transform, and analyze data from edge devices effectively. He also touches on the importance of retraining models and managing operations on edge devices, suggesting that successful implementation will require a combination of data engineering, data science, and DevOps practices.
The Future of Edge Computing: Challenges, Strategies, and Provider Landscape
The final paragraph addresses the challenges and strategies associated with Edge Computing, such as physical practicalities, security concerns, and data sovereignty. The speaker also considers the fit of Edge Computing within existing data and AI strategies and the potential for it to be either opportunistic or a core part of a broader technological rollout. The discussion includes the evolving landscape of providers and the possibility of new players entering the market. The speaker concludes by encouraging the audience to think about how Edge Computing might fit into their long-term data strategies and to consider the practical steps for implementation.
Keywords
Edge Computing
Data & AI
Latency
Thorogood
Real-time Analytics
Data Sovereignty
DevOps
MLOps
IoT (Internet of Things)
Data Lake
Highlights
Introduction to Edge Computing and its implications on data and AI.
Definition of Edge Computing as the ability to have compute and storage within devices where data is generated.
Discussion on the potential for low latency or no latency capabilities in data processing and AI applications.
Introduction of Thorogood as an independent Global Professional Services firm specializing in business intelligence and analytics.
Explanation of how Edge Computing can enable real-time or near real-time analytics.
The importance of efficient usage of physical assets and how Edge Computing can optimize it.
Opportunities for combining new and diverse datasets with Edge Computing to uncover new insights.
The ability to prioritize and filter information with Edge Computing to reduce unnecessary data transfer.
How Edge Computing can reduce dependency on environmental or physical limitations.
Examples of Edge Computing in retail, manufacturing, workplace safety, and healthcare.
The intersection of Edge Computing with data engineering, data science, and MLOps.
The role of cloud vendors in providing infrastructure and software for Edge Computing.
Challenges in implementing Edge Computing, including physical practicalities and security concerns.
Strategic considerations for integrating Edge Computing into existing data and AI strategies.
The future landscape of providers in the Edge Computing space and potential partnerships.
Practical steps for getting started with Edge Computing and aligning it with long-term data strategies.
Transcripts
Thanks, everybody, for coming along today. We're going to be talking about Edge Computing, particularly its implications for data and AI, and thinking about where this kind of capability could go. The agenda is pretty simple: share a little bit about Edge Computing and our understanding of it, describe where we're starting to see implementations with customers (or at least conversations about it), and give some exposition as to how we think it could, and probably will, impact the data and AI landscape.

Quickly, to introduce myself and then Thorogood: my name is Ben Dunmeyer, a data analytics consultant based in the US. For those of you that don't know Thorogood, we're an independent global professional services firm specializing in business intelligence and analytics strategies, solutions, and services. We're a global company with offices in Asia, Europe, and the Americas; as I mentioned, I'm in Philadelphia. All our consultants are recruited and trained in the same way to develop a unique mix of skills, blending business understanding, in the form of industry and functional experience, with a strong technical aptitude and a deep understanding of analytical tools and techniques. We offer a full range of data analytics services; you can see some of those on screen. In terms of the technologies we work with, we are an independent consulting firm, meaning we don't work with one specific technology but across the best-of-breed tools and platforms in the industry, big players as well as niche providers. What we want to do is partner with these key players in the market to provide our clients with a solution that best suits their needs. As we talk about Edge Computing, the technologies and vendors on the screen will probably evolve into the future, but we'll keep that in context.
Starting out, let me give an idea and maybe a little bit of a definition of Edge Computing and where we see the intersections with data and AI applications in particular; we won't really get into things like hardware or configurations. I've grabbed a few things from the public domain to present the topic. The concept of Edge Computing is relatively simple, though, like anything, there's more than meets the eye when it comes to actually implementing it. The fundamental idea is that, as technology advances, we have the ability to have both compute and storage within devices, or wherever data is being produced or generated, as opposed to only in some sort of central data hub or central, firewalled infrastructure. Think of a sensor in an office gauging temperature or the number of people in a room, an autonomous vehicle, or a piece of equipment in a factory: they're all mechanisms, machinery, or devices that can generate data. Edge Computing brings some level of computation and some level of storage onto, or right next to, that device.

A big idea is being able to achieve low-latency, or really no-latency, capabilities for data processing, data analysis, and potentially AI applications. The diagram on the right draws out a number of different topics and some examples of these sorts of devices. It's really about thinking through what you want to accomplish and what the latency requirements are, along with the business implications: is there an opportunity or a need for near-real-time or actual real-time analytics, or is the need more for traditional business intelligence or operational reporting? In principle, the concept has been around for a number of years, and it's getting more and more popular and practical. It's putting the technology and capabilities in place where data is being produced, so we can potentially do some of the same sorts of operations or analyses that we do now in a centralized data store or data hub that we curate and build. I really want to think about how we might apply these sorts of scenarios, what scenarios they might apply to, and what the value-add could be. So here I've laid that out: what is the opportunity, and how does Edge Computing fit that bill or allow us to do it?
The first one, which I've mentioned a couple of times now, is real-time or instantaneous decision-making. This really hits the nail on the head: with some level of computation and storage on, or right next to, the device, data doesn't require movement through networks to some data store and further batch or streaming processing, effectively allowing us to do low-latency, or maybe even no-latency, analysis as the data is being produced.

Next is efficient usage of physical assets. Think about technology in manufacturing sites, smart buildings, or any number of other physical things; that hardware, to use the term pretty loosely, is going to be idle at times. There's not always going to be someone in a room, and a given piece of equipment is not always going to be running. But if it has edge compute capabilities within it, or as part of its framework, how do you use them to best effect? If they're there and just sitting idle, we want to be using them, maybe running some analyses or retraining a model. The opportunity is getting the most out of investments, whether physical investments or data investments, and thinking about how we can use those investments all the time, even when they're idle.

Something that's near and dear to us in the data analytics community is the ability to combine new and diverse datasets to get to new perspectives, come up with new analyses, or uncover insights we hadn't previously thought about or hadn't been able to prove or disprove. The angle with Edge Computing is the nascent, or new, datasets that can be acquired if you're able to tap into this capability. A lot of this comes from technological advances in building components or equipment that can capture information, whether it's biometrics, information about a vehicle, or, again, readings in a plant, to use a couple of the common examples. With the opportunity to collect new data, what new things can we come up with? Can we improve decision-making in any number of areas? I'll go through some examples later, but the opportunity is collecting new and diverse data that we can use in lots of different ways.
Then there's the ability to prioritize or filter information. With any sort of routine, you want to know: should I be doing this? For the data I'm collecting from X, Y, or Z, do I need to bring it back into my central hub, store it, and process it, or should that investment of time be spent somewhere else? With compute where the data is being collected, we might be able to make decisions to exclude erroneous data, or do some initial evaluation as to what information has any purpose and what we do or don't store. I think it gives us a little more autonomy, and maybe auditability, to decide, for things like Internet of Things sensors, what information we want to bring back into a central data server, because that's going to be an investment: you have to set up the physical connectivity, and the routines that do it have to be maintained. So can you filter out unnecessary information at the source?

The last example I have here is the ability to reduce dependency on environmental or physical limitations or factors. This one's a little extreme; there are probably just a few actual scenarios that come to mind. But think about remote locations across the world, or somewhere in the ocean like an oil rig: you won't necessarily have a high-speed network there, and very likely you'll have a high-latency network or little bandwidth. Edge Computing gives us an opportunity to do some computation and data storage where we don't have the ability to pull all, or even most, of that information into some central place. It's a pretty severe case, and probably a rare one, but I think it's pretty interesting.
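As a rough illustration of the filtering idea described above, the sketch below drops out-of-range readings on the device and forwards only a compact summary instead of every raw reading; the sensor bounds and field names are hypothetical, not taken from the talk.

```python
from statistics import mean

# Plausible-range bounds for a hypothetical temperature sensor; readings
# outside them are treated as erroneous and never leave the device.
VALID_RANGE = (-40.0, 125.0)

def filter_readings(readings: list) -> list:
    """Exclude erroneous data at the source, before any network transfer."""
    lo, hi = VALID_RANGE
    return [r for r in readings if lo <= r <= hi]

def summarize(readings: list) -> dict:
    """Forward a small summary to the central hub rather than raw data."""
    clean = filter_readings(readings)
    return {
        "count": len(clean),
        "min": min(clean),
        "max": max(clean),
        "mean": round(mean(clean), 2),
    }
```

The payload sent back is a handful of numbers regardless of how many raw readings the sensor produced, which is exactly the network-usage saving the talk describes.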
I've alluded to a few use cases, and there are some fairly obvious examples; I'd only expect this to grow over time as technology and hardware advance and the amount of information we capture, as studies have shown, grows exponentially. Here are a few examples, broken into groups; I'll briefly hit on a few of them, and you might think about where this applies to your organization or the industry that you play in.

Starting from the left: retail, a scenario where I've done a lot of work and where a lot of my expertise lies, both at the front of the store and on the manufacturing side just to the right. Think of things like tracking stock: where are we low on inventory, whether in the warehouse or in the actual brick-and-mortar stores? There's also the ability to, maybe not in the moment but pretty quickly, get information about promotion effectiveness. If we run a promotion on a certain display or in a certain aisle, does the foot traffic in that aisle increase, decrease, or stay the same on the days or at the times when the promotion is active? The ability to see that and potentially make almost real-time decisions is wildly interesting.

Workplace safety applies across construction zones and manufacturing. At the top here I gave the earlier example of an oil rig; isolated areas are a pretty severe example because there aren't that many, but how can these scenarios be better managed, and how can we use data analytics where we don't have the ability to pass a high volume of data to a central data store, get it back, and make use of it? In healthcare, think about patient monitoring, picking up on the numerous biometric and other signals you can get from monitoring a patient, or something like room availability in a hospital or an office. In transportation: autonomous vehicles, which are in the news right now, traffic management, and monitoring emergency vehicle responses or even pathways; transportation is an interesting one that a lot of us experience every day. There are a few others here that I'll leave you to read through, but there are a number of interesting use cases to think about. As I said, I think this will only grow as technology advances and we're able to collect data from more diverse places. It's interesting to see where it goes, and ultimately, if the hardware and technology are in place, can we actually take advantage of it?
So let me try to frame how the concepts of Edge Computing, and some of these somewhat generalized use cases, might impact data and AI. Again, I'm not going to jump into the details of how to do it, but I really wanted to think through what the components are, and how we might adopt the theories and practices we've developed over years, and continue to develop, in building data architectures, for use cases that present some physical and practical limitations in terms of hardware and what we can do. I'll give an example, again not to any level of detail, that hopefully frames things and gets us thinking about what we need to make this a reality, or a potential reality.

I'll use the example of a manufacturing plant; I think it's a pretty on-the-nose example given the number of different use cases: monitoring equipment to control it, but also identifying equipment with a high probability of failure or faults after a given amount of time or usage, which are pretty common use cases. Other applications include climate control or building control, which isn't necessarily manufacturing-specific, and identifying quality issues: if you can spot problems with goods ahead of time using some sort of video surveillance, or even information coming from the machinery producing the products, you can get ahead of quality issues. I'm sure there's an endless list of ideas, but for this example let's think of a classification or probability model that we've developed and want to apply in order to identify faults or failures, or the likelihood of them, on either a subset of equipment or a certain part of the process, whatever it might be.
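One hedged way to picture the fault-likelihood idea just described is the simplest possible estimator: an empirical fault rate computed from historical, labelled telemetry. The record layout and numbers below are invented for illustration; the classification model discussed in the talk would use far richer features and a proper training procedure.

```python
# Each historical record: (hours_since_service, fault_observed).
# Field meanings and values are illustrative, not from the talk.
history = [
    (120, False), (400, False), (900, True), (1100, True),
    (300, False), (950, True), (80, False),
]

def fault_rate_above(records, threshold_hours: int) -> float:
    """Empirical probability of a fault for equipment past a usage threshold,
    estimated from labelled history collected via edge telemetry."""
    past = [fault for hours, fault in records if hours >= threshold_hours]
    return sum(past) / len(past) if past else 0.0
```

Even this toy version shows the shape of the workflow: history accumulated in the central platform drives a probability estimate that can later be pushed down to the device.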
So we have that model, we have our edge devices across the manufacturing plant, and of course we have our enterprise data and AI architecture; the technology stack doesn't really matter, and it's likely multi-cloud and multifaceted. As a first step, we're bringing this data into our data platform, so I've assumed some sort of data lake here. With the edge device, or Internet of Things device, we're able to pass through telemetry and other information in some sort of structured way, and we land that data, presumably as files, somewhere in the data lake. Then we have a pretty natural, well-learned way of curating that data, whether it's streamed or batch: we might organize the data, apply restatements, do some data cleaning, whatever makes the data more presentable or easier to interpret. Interpreting logs is something we run into often; we know how to do that, we do it all the time. That's the bread and butter of a data and AI architecture: collecting data, transforming it, and then using it for various things. It's more or less the same here; we're not thinking about a high velocity of data in this case. With that curated data we can do some data analysis, reporting, and analytics, so we've built something that we're pretty familiar with. We'll assume we're not feeding a continuous stream of data; this could be a batch, or data collected every so often from the edge devices.

Where we have the facility, the scalability and elasticity of cloud, and the software to do so, we'll build and train a model. We have historical data, hours and hours or months and months of telemetry from these edge devices, and we're able to develop a model that helps us predict the probability or likelihood of a fault or failure for a given piece of equipment. I wouldn't expect us to be building or training this model on the edge device, though I suppose you could; you're first using that device to collect information, and maybe to do some structuring of it. We then develop the model and deploy it, having some mechanism to physically push that model onto the edge device, completing the feedback loop, so to speak. The idea here is that we're taking advantage of our known cloud and data architecture, where we can build these sorts of solutions, and then moving some of the compute or storage requirements onto the edge device. We don't need this data to come into our main platform at a super-high velocity all the time; we have a model on the device that interprets the data and, say, spits out a result on a gauge, or however you imagine it transmits this information.

Then there's the whole idea of retraining. At whatever intervals, we send new telemetry data back into the platform; or maybe, given the compute possibilities on the edge device, we can do some monitoring, logging, and detection of any skew or drift of the model on the device itself. We take that information, along with new data, send it back through our loop, retrain the model, and redeploy it. Practically, we're not going to do everything on an edge device, but we're taking advantage of a number of different facets: doing a bit more on the device and integrating it with our tried-and-true platform.
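The retraining feedback loop described above could start with something as simple as on-device drift detection. This is a toy sketch with made-up baseline statistics: the device flags when live data has moved away from the distribution the current model was trained on, and that flag (plus new telemetry) would be sent back to the central platform to trigger retraining.

```python
from statistics import mean

# Baseline captured centrally when the current model was trained;
# the numbers here are illustrative assumptions.
BASELINE_MEAN = 45.0       # mean temperature in the training data, in °C
DRIFT_TOLERANCE = 5.0      # acceptable shift in the live mean, in °C

def needs_retraining(recent_readings: list) -> bool:
    """On-device check: has live telemetry drifted from the training baseline?
    A True result would be reported back through the feedback loop."""
    return abs(mean(recent_readings) - BASELINE_MEAN) > DRIFT_TOLERANCE
```

Comparing a running mean against a stored baseline is deliberately crude; the point is that even a few lines of on-device logic can decide when it is worth spending network and cloud resources on a retrain-and-redeploy cycle.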
The other things that really stick out to me are some of our core capabilities for operating and running systems and data systems. If we're relying on edge devices to run a manufacturing plant, you've got safety to consider, and the livelihood of the business is predicated on this, so you want reliable, consistent routines that allow you to manage it. Enter the "Ops": you can put different acronyms in front of it, but DevOps and MLOps matter particularly in these instances. Do you have the right DevOps setup to manage the data engineering and data collection routines? The MLOps configuration to manage the model, interpret information coming from the edge device, understand drift, understand log information, collect the data, and retrain, or just re-score, the model? And what does it actually look like on the edge device: how are we deploying code, more or less, to that device? How are we introducing basic routines that clean data, produce an analysis, or even produce an output that gets visualized on some sort of screen at the manufacturing plant, or on the device, whatever it might be? So I'm starting to think about not just the data manipulation, the transformations, and the routines, but how we actually manage them. Again, we won't get into details here, but this is where my head goes when looking at the toolsets now, and I think this is an area where we're going to see the most development, the most growth.
naturally start with the big cloud
vendors and and usually the leaders in
the space particularly for data and AI
so Azure AWS and and gcp a few other
players out there as well but the the
big players I've just listed out here
the the technology Stacks if that's the
right word that it seems that's how the
the software vendors are positioning
them as kind of stacks of edge or stacks
of iot capabilities and and again just
list of the names here for for awareness
really really what it looks like these
the cloud players are doing is they're
figuring out the infrastructure side of
things and that'll be interesting play
does does Microsoft produce devices for
for these different use cases this AWS
gcp do the same sort of thing or are
they going to create the software that
can get installed on edge devices which
are created by traditional manufacturers
so it'd be really interesting to see
certainly across each of these platforms
and the build out of the technology
stack and and really it's a an
infrastructure capabilities and then
leaning into the data analysis machine
learning data collection monitoring
capabilities that will needed to
ultimately kind of see a edge Computing
Vision come to life some of the kind of
questions that come to my mind things I
was asking myself list them out here do
we see the big Hardware producers Dell
IBM others playing in this place or they
are they you know they're expertise in
Hardware are they going to develop the
hardware the actual Edge devices partner
with the cloud underwear or try to do it
themselves do we see digital native
companies innately able to to do this
and in that case it may not we may not
be thinking physically as much but are
they they able to either produce maybe
produce products that we're adopting and
using these scenarios fully is it kind
of fully packaged together if we think
about those that are actually
manufacturing this equipment so building
sensors or building the edge devices
that go and sit on Machinery or a
turbine or whatever it is are they
partnering with one of the cloud vendors
are they doing it themselves so just I
think it's interesting to see I don't
think we quite know a lot will probably
depend on the how lucrative this this
space is who jumps into it who's able to
make it a practicality and I think the
world we live in that right now with
microchip and like precious metal
challenges and supply chain does that
slow this down or does that construct
the the players that could potentially
compete here so this is I think an area
we have to we have to monitor it to see
what the cloud software data vendors are
doing and see who else either enters the
space the other takeaway I have is is I
The other takeaway I have is that this is not so different from what we've always had to do; it's just happening in a different place, introducing some physical practicalities and the need to be responsible with the limited compute and storage we have. Really, it's the intersection of data engineering, data science, visualization, DevOps, MLOps — and probably other bubbles you could add — and it's getting all of these right within a single solution that makes it a practicality. If we have inefficient routines that collect and transform data, routines that take forever to run, we're not actually going to take advantage of that low-latency connection.
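To make that efficiency point concrete, here is a minimal sketch of the kind of edge-side routine this implies: summarise a batch of readings locally and forward only a compact payload, so most raw data never crosses the network. All names, thresholds, and payload shapes here are illustrative assumptions, not any vendor's actual API.

```python
# Illustrative only: the threshold and payload shape are assumptions.
from statistics import mean

ALERT_THRESHOLD = 75.0  # assumed out-of-range limit for this sensor

def summarise(readings):
    """Collapse a raw batch into the small payload worth transmitting."""
    alerts = [r for r in readings if r > ALERT_THRESHOLD]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": alerts,  # only the out-of-range values travel in full
    }

# Ten raw readings collapse into one compact record for the central hub.
batch = [70.1, 71.4, 69.8, 76.3, 70.0, 70.9, 71.2, 80.5, 70.4, 69.9]
payload = summarise(batch)
```

The design choice mirrors the takeaway above: the device keeps (or discards) the full-fidelity data, and only the summary plus the exceptional values are sent upstream.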
And if we don't have the appropriate ML or DevOps capabilities to make changes in real time, or to collect enough information to adjust infrastructure configurations or whatever it might be, it's not actually going to work. So we've got the same foundations of data and AI here; it's about learning the nuances, the keys to making an edge computing use case or implementation successful. There's also the strategy layer: what are my business objectives, my technical strategy, my business and industry strategy? And with a physical footprint there is an obvious capital expenditure. Do you retrofit existing facilities or equipment with these devices, or is it only net new — smart buildings, smart grids, new vehicles that you apply these sensors to? So there are questions there about what the investment is and where it applies. And just as we're always thinking about merging legacy systems with new systems — a continuous cycle — how does that work here, with a pretty diverse set of new applications?
I'll start to wrap up here — just about out of time — and leave a few thoughts. I didn't really talk about the challenges; some of them are obvious, but how do you come up with approaches that combat the known challenges of edge computing, and the ones we expect to see as it actually becomes a reality? There are the physical practicalities I mentioned: actually procuring the equipment, installing it, and maintaining it over time. You do have to have some level of connectivity. What about security? If you have data, and potentially some intellectual property, stored on these devices, do you need to secure those devices? And what about data sovereignty?
What's the fit with your existing data and AI strategy? I mentioned this a slide or two back: where does your business want to go, where are you going with your technology strategy, and how do the two work together? Again, I think edge computing is a facet of one's overall data and AI strategy. It could be opportunistic, use case by use case, or it could be a wide rollout, a big bet that someone makes. But if you can get that right — if you can figure out the scalability of a base cloud architecture, which cloud does enable us to do, and scale it together with edge computing and with these other new and interesting capabilities (next year it will be something different), effectively and with minimal rework — I think that's the real differentiator.
As for what the landscape of providers looks like: will we see the mainstays of the data and AI space lead here, or will it be new players, conglomerates, or partnerships? It will be interesting to see, given the potential uniqueness of the hardware that's needed. And then, how would you get started? Some industries lend themselves naturally to this, or are already doing it — so how do you accelerate, make better use of the investment, make use of idle time, all those sorts of things? Again, maybe an impossible question to answer, but an area to start thinking about: either use cases you have, opportunities you see to implement this, or an assessment of what your 10-, 15-, or 20-year data strategy might look like.
So I'll close out here. I've listed a few areas where Thorogood can help with some of the exploration and thinking around edge computing, along with a number of other things — you can see the full range of services we offer listed here. Do get in touch if you want to talk about this topic or any other; I'm always happy to discuss and share a perspective. Thank you very much for attending, I appreciate your time today, and have a good one. Thank you.