Common business use cases for generative AI
Summary
TLDR: In a panel discussion, industry experts from Google, Vodafone, and Blue Core shared insights on applying generative AI in business. They highlighted the importance of prioritizing customer value and technical feasibility in AI projects. Notable use cases included AlphaFold's impact on drug discovery, product cataloging, and customer service operations. The panel emphasized the transformative potential of AI, the need for continuous learning, and the value of hands-on experience in understanding its practical applications and limitations.
Takeaways
- Prioritization of AI use cases focuses on customer value, friction points, technical feasibility, and alignment with internal research innovations.
- AlphaFold on Google Cloud is a prime example of AI research translated into a practical solution, accelerating drug discovery by understanding protein structures.
- Product cataloging is a significant use case where AI helps categorize products for search and create website copy, improving customer experience and retail performance.
- Customer service operations benefit from AI through conversational agents and internal support for SRE teams, enhancing post-mortem search and summarization.
- Generative AI is used to improve platform features, increase internal efficiency, and better serve customers by standardizing product data and mapping it to taxonomies like Google's.
- Telecommunications companies leverage AI for network deployment, predictive maintenance, and customer call analysis, transcribing and summarizing calls to understand customer issues and improve service.
- Technical solutions in AI involve embedding models, vector databases, and vector search engines to process and retrieve unstructured data efficiently.
- The application of Transformer models extends beyond text and images to non-traditional use cases, such as analyzing and predicting user events in software applications.
- Global deployment of AI solutions requires attention to scalability, replicability, data security, and compliance with local regulations like GDPR.
- Business value from AI is measured in increased employee productivity, improved user experiences, and the generation of new insights or ideas that were previously impossible.
- Continuous learning, hands-on experience, and staying up to date with the latest AI research and technologies are emphasized as keys to maximizing the potential of generative AI.
Q & A
What is the main focus of the session on common business use cases for generative AI?
-The session focuses on discussing and understanding the top business use cases for generative AI, how to prioritize them, and how to get started with their implementation in various industries.
Who are the panelists in the session and what are their roles?
-The panelists include Nema Dakiniko, a product manager for Google's generative AI portfolio; Ignacio Garcia, the global director of data analytics and AI at Vodafone and CIO of Vodafone Italy; Arvind Christian, who runs the engineering, data science, and solution architecture teams at Blue Core; and Donna, who leads the Technical Solutions management team for generative AI at Google Cloud.
How does Google prioritize its AI solutions for customers?
-Google prioritizes AI solutions by focusing on what will add value to the customers, identifying friction points they face, and considering the technical implementation of the solutions. They also look internally to their research teams to bring innovations to customers.
Can you explain the significance of AlphaFold on Google Cloud?
-AlphaFold is a significant research project by DeepMind that focuses on solving the protein folding problem. It's important because understanding a protein's structure can lead to the development of drugs that modulate its function. Google Cloud has operationalized this research into a solution that adds value to healthcare organizations by making it reproducible, scalable, and cost-effective.
What are some of the technical patterns observed in different AI use cases?
-Technical patterns include using embedding models to process unstructured data and index it in a vector database, using code-generation models such as Codey to generate SQL or Cypher for database access, and applying Transformer models to non-image, non-text use cases.
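The first pattern above (embed unstructured data, index it, retrieve by similarity) can be sketched in miniature. This is a hedged illustration, not any vendor's API: `embed` below is a toy bag-of-words stand-in for a real embedding model, and the linear scan stands in for a vector database or vector search engine.

```python
import math
from collections import Counter

# Toy stand-in for a real embedding model: in production the vector would
# come from a hosted text-embedding API, not a bag-of-words count.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorIndex:
    """Minimal in-memory index: store (doc, vector), retrieve by similarity."""
    def __init__(self):
        self.items = []

    def add(self, doc: str):
        self.items.append((doc, embed(doc)))

    def search(self, query: str, k: int = 1):
        qv = embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(qv, it[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]

# Index two hypothetical SRE post-mortems, then retrieve by question.
index = VectorIndex()
index.add("post-mortem: checkout outage caused by expired TLS certificate")
index.add("post-mortem: search latency spike from cache eviction storm")
print(index.search("why did checkout go down?", k=1)[0])
```

A production system swaps `embed` for the embedding model and `VectorIndex` for the vector database; the calling pattern stays the same.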
How does Vodafone Italy utilize AI in its operations?
-Vodafone Italy uses AI for churn model analysis to understand customer behavior, next-best-offer models, network deployment planning for capex efficiency, predictive maintenance, and summarization of customer call data to understand and improve customer service and reduce detractors.
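The call-analysis workflow described above (transcribe calls, summarize each one's reason, aggregate the reasons) can be sketched as a pipeline. All three steps are hypothetical stand-ins here: real systems would call speech-to-text and LLM summarization services, while these stubs only show the pipeline shape.

```python
from collections import Counter

# Hypothetical sample data standing in for call-center audio recordings.
SAMPLE_TRANSCRIPTS = {
    "call-1": "my bill is wrong this month",
    "call-2": "no signal at home since yesterday",
    "call-3": "I was charged twice on my bill",
}

def transcribe(audio_id: str) -> str:
    # Stand-in for a speech-to-text service.
    return SAMPLE_TRANSCRIPTS[audio_id]

def summarize_reason(transcript: str) -> str:
    # Stand-in for an LLM summarizer; a real system would prompt a model
    # for the call reason. This keyword rule is for illustration only.
    return "billing" if "bill" in transcript else "network"

def top_call_reasons(audio_ids):
    # Aggregate per-call summaries into ranked call reasons.
    reasons = Counter(summarize_reason(transcribe(a)) for a in audio_ids)
    return reasons.most_common()

print(top_call_reasons(["call-1", "call-2", "call-3"]))
# → [('billing', 2), ('network', 1)]
```

The value described in the panel comes from the aggregation step: ranked, granular call reasons replace survey-level guesses about why customers call.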
What are the key technical considerations when deploying AI solutions globally?
-Key considerations include scalability and replicability, ensuring data security, adhering to local regulations like GDPR, avoiding bias in models, and creating policies to maintain these standards.
Why did Blue Core choose Google Cloud for its AI projects?
-Blue Core chose Google Cloud due to its innovation roadmap, co-innovation approach, data governance and security features, and the performance they observed with Google's AI technologies compared to other models.
What are the business benefits of using generative AI?
-Business benefits include increased employee productivity, more intuitive and better user experiences, new insights or ideas that were impossible before, and cost-effectiveness in problem-solving and scaling solutions.
What advice does the panel have for businesses looking to implement generative AI?
-The advice includes experimenting and iterating with AI, investing in foundational architecture, creating a safe environment for experimentation, not restricting innovation, and educating the entire company about the potential of AI.
How does the panel suggest businesses should approach the rapid evolution of AI technologies?
-Businesses should stay humble, focus on short-term use cases, invest in tooling and platforms to support AI, and ensure that their AI initiatives are not centralized to avoid stifling innovation.
Outlines
๐ค Introductions and Panel Discussion on AI Use Cases
The panel discussion begins with introductions of the speakers: Nema Dakiniko, a product manager for Google's generative AI portfolio; Ignacio Garcia, the global director of data analytics and AI at Vodafone and CIO of Vodafone Italy; Arvind Christian, head of engineering at Blue Core; and Donna, who leads the Technical Solutions management team for generative AI at Google Cloud. The panel aims to discuss common business use cases for AI, with a focus on how to prioritize and implement AI solutions effectively. The conversation starts with Donna explaining the prioritization process based on customer value and technical feasibility, then moves on to specific examples like AlphaFold on Google Cloud and its impact on drug discovery.
๐ Product Cataloging and Customer Service Operations
The discussion continues with the application of generative AI in product cataloging and customer service operations. Arvind and his team have worked on categorizing new products for search, improving website content, and supporting SRE teams with post-mortem search and summarization. Ignacio shares Vodafone Italy's experience with AI in customer service, emphasizing the importance of understanding customer needs and improving interactions through AI. The conversation highlights the benefits of using AI to standardize product data and the technical considerations for deploying AI solutions globally.
๐ ๏ธ Technical Design and Implementation of AI Solutions
Kevin and Donna discuss the technical aspects of designing and implementing AI solutions. They cover patterns like using embedding models and vector databases for unstructured data, code-generation models for database access, and applying Transformer models to non-text use cases. The conversation also touches on the optimization of the inference pipeline for AlphaFold, the importance of reproducibility and experiment tracking, and the use of Vertex AI for automating processes and ensuring security.
๐ Global Deployment and Business Value
The panelists delve into the challenges and strategies of deploying AI solutions worldwide. They discuss the need for scalability, replicability, and compliance with local regulations like GDPR. The conversation highlights the importance of data logistics, policies, and the use of Vertex AI for secure and efficient model deployment. The panelists also share their experiences with the business value of AI, including improved accuracy, cost-effectiveness, and the ability to understand customer needs better.
๐ Insights and Learnings from the Journey with Generative AI
The panel concludes with insights and advice on working with generative AI. Kevin shares his learnings about the importance of continuous learning, accessibility of AI tools, and the value of research papers. He emphasizes the need for a safe environment for experimentation and the transformative potential of AI. Donna advises on experimenting, iterating, and investing in foundational technologies to combine AI models with proprietary data. The panelists stress the importance of staying humble and focused on short-term use cases while preparing for the rapid evolution of AI technology.
Mindmap
Keywords
AI
Product Manager
Data Analytics
Generative AI
ML Infrastructure
Customer Service Operations
Technical Solutions
Data Structures
Replicability
Google Cloud
Highlights
Session on common business use cases for generative AI
Nema Dakiniko, a product manager at Google, introduces the session
Ignacio Garcia, global director of data analytics and AI, discusses Vodafone's AI initiatives
Arvind Christian shares Blue Core's work in engineering, data science, and solution architecture
Donna leads the Technical Solutions management team for generative AI at Google Cloud
Prioritization of AI use cases based on customer value and technical feasibility
AlphaFold on Google Cloud as a notable AI research breakthrough
Product cataloging as a key business use case for AI
Customer service operations improved with AI conversational agents and internal support
Vodafone Italy's use of AI for call center call summarization
Technical design patterns for different AI use cases
Importance of data structures and taxonomies in AI solutions
Global deployment of AI solutions with considerations for scalability and replicability
Use of generative AI in retail for customer movement and shopping behavior analysis
Google Cloud's role in providing AI tools and infrastructure
Business value of AI use cases and measurable outcomes
Advice for businesses on adopting AI: Experiment, iterate, and invest in foundational technology
The transformative potential of generative AI across industries
Transcripts
Let's get started. Hey everyone, welcome to this session on common business use cases for generative AI. I am Nema Dakiniko, a product manager at Google for our generative AI portfolio, but I will turn it over to our esteemed guests here, who can introduce themselves, because we do have a very packed panel. And we're going to start with you.

Hi everybody, I'm Ignacio Garcia. I'm the global director of data analytics and AI for Vodafone, and I'm also the CIO of Vodafone Italy.

Hi everyone, Arvind Christian here. I head engineering at Blue Core, so I run engineering, data science, and our solution architecture teams.

Hi, I'm Donna. I lead the Technical Solutions management team for generative AI at Google Cloud, and together with the solution architecture teams, with Kevin's team, we identify, design, and build AI solutions.

[inaudible] solutions, and we also do ML infrastructure as well.
Very cool. Okay, so let's start first by talking about top business use cases, and Donna, I'm going to start with you on this one. I really want to understand: many people here are thinking, "Look, gen AI, fantastic, this is great, I'm sold," but what are those top use cases? How do you prioritize them? How do you get started?

Sure, yeah. I can start with prioritization and then go into a few use cases. In our prioritization we focus on our customers: what will really add value to them, where they are seeing friction points, and then the technical implementation of that. We also look internally to some of our research teams and think about how we can take those innovations and bring them to our customers. One example that we worked on, around one and a half years ago, was AlphaFold on Google Cloud. DeepMind had this amazing research, and maybe to give a little bit of context on the protein folding problem: scientists have long been interested in solving it because once a protein's structure within a cell is understood, scientists are able to develop drugs that can modulate its function. But for healthcare organizations to actually leverage that, they have additional requirements, for example reproducibility and scalability, and it has to be cost-effective. So we took their amazing research and operationalized it on Google Cloud through a solution.

Some other, more recent areas where we're seeing traction are, for example, product cataloging: being able to categorize a new product, label it for search, and then also create the website copy. Arvind and his team have done some amazing work in this space. And customer service operations: it's not just a conversational agent, which you may have experienced on a website answering questions, but also internal use, for example supporting SRE teams with post-mortem search and summarization, which we've done some work on, and supporting support agents with summarization and next steps, where Ignacio and his team have done some great work.
Arvind, what are some of the use cases and prioritizations that your organization does?

Yeah, before I jump into a use case, maybe just a little step back on what we at Blue Core do, so we can connect the dots. We are an identification and customer movement platform: we work with large enterprise retailers to identify shoppers and then convert them into repeat customers. We've used traditional AI and created over 20 retail models using first-party data: shopper information, behavioral data, and product information. These models are baked into our platform, so a marketer can use them to create campaigns and audiences. The content that needs to be generated, the channels in which to deliver that content, and finally the timing of when to deliver it are all personalized on a per-shopper basis.

With the advent of gen AI we looked at a couple of areas: one is how we can improve the features on our platform; we also looked at internal efficiencies and opportunities to better serve our customers. The problem Donna was referring to is core to our value proposition: taking unstructured product catalog data and mapping it to Google's product taxonomy. For example, one retailer could call an item a t-shirt, another retailer could categorize it as a "true cut t-shirt," and so on and so forth, but if you standardize it in Google's taxonomy, it is probably labeled as a t-shirt, which is categorized under shirts, which is under apparel, and so on. There are a number of advantages to standardizing the product catalog: we will improve our models and our recs, our customers will be able to analyze performance within their product catalog, and finally we will be able to deliver trends within verticals across the retail space. We were not very successful using traditional AI, but with gen AI we were able to solve this specific problem.

Very cool. And Ignacio, hopefully people are a little bit more familiar with Vodafone in terms of what you do, but I'd love to understand your use cases and prioritizations.

Oh, thank you. I think
I still want to take a couple of minutes just to explain the complexity that we have; that context may help in understanding how and what we're using. We are a telecommunications company: we do mobile, we do television, we do fixed lines, and we do IoT, so the whole package. Across the world we have more than 300 million customers and billions of IoT devices, so I'm talking about scale. And we're in Europe, in Asia, and in other areas, so languages are completely diverse, and that is another point that is very important for how we're using AI and the type of problems that we need to solve.

A bit of background as well: we have been very focused on our partnership with Google on cloud and data. Google is our partner in the data domain, and we have been very focused on making sure that our data is in the right place and is safe for our customers. We have anonymization, and we comply with all the regulations that we have in Europe around privacy; we're super focused on that. And we have been using AI and Google tooling for many of our normal operations: churn model analysis, trying to understand why customers are leaving; next-best-offer models on the customer side; and then, on the network side, analysis of where to deploy the network, so capex efficiency, which is super important, and what we call predictive maintenance, trying to understand what is going to break and making sure that we can replace the components. That was successful.

But with gen AI we have now been experimenting, and it's a completely different dimension. For example, the use case you mentioned, which we implemented in Italy: we are taking all the calls that our customers make to the call centers, transcribing them into text, and then getting a summarization of the problems. The original intention was to reduce the risk of customers calling us repeatedly and to automatically address the deep detractors. This is only possible now that we have large language models available, and we were able to do it very fast because we have been very consistent in creating the data structures to combine the models with our data in a good way. In this particular use case, we're taking these 50,000 calls, transcribing them into text, summarizing them, and getting the reasons behind the problems, and it's a complete game changer, because then we understand what the customers are really saying. We don't need to do surveys and get high-level data; we're really getting the actual details of why they are calling us, and then we can intervene. And that data, which originally served one purpose, has become key to doing other things: understanding behaviors, potential upselling, and other areas.

The other important thing is that we need to replicate this across all the countries, so replicability and scale are fundamental. It's not only a question of what the use case is, but can we do it fast, is it secure, and can we do it across the world in a way that we can repeat? We will talk later about Vertex AI and the different components from Google that are allowing us to do that.
Absolutely, it's interesting. I think you both actually spoke about data, taxonomies, and data structures, so this goes into our second question: how do you actually technically design these solutions? Kevin, I'll start with you.

Okay, so Donna's team and my team have worked on a number of different business use cases applying gen AI, and what's interesting is to see a few technical patterns surface and permeate across the different use cases. The first one is: how do I get data that's outside of the LLM into my application? That pattern is very pertinent to customer support, for example. One of the very common technical patterns is to use an embedding model to process your unstructured data and then index it with a vector database. Matching Engine, now called Vector Search, is a very popular option, and now we also have AlloyDB with pgvector, so there are a lot of options. After you index the data, you can very quickly retrieve your unstructured data: images, text, and so forth. The other type of data retrieval we're now seeing is using a coding model, like Codey, to generate the SQL to access your relational database, or to generate Cypher to access Neo4j. That's another up-and-coming pattern we're seeing.

Another area we're seeing is the application of these Transformer models to non-image, non-text use cases. For one of the partners we're working with, FullStory, we're helping them build a sequence model to analyze and predict user events. And finally, as Donna mentioned, AlphaFold is actually a Transformer model, which is very interesting: instead of generating language or images, it generates protein structures. We took DeepMind's research and broke the inference pipeline, which is actually a multi-step pipeline, into parts, applying different types of compute to the different processes. The earlier parts do a lot of data retrieval, so we used high-IOPS CPU nodes; the later stages are extremely compute-intensive, so we used NVIDIA A100 GPUs. That's how you can optimize going from research to production. In production we also need reproducibility and experiment tracking, and for that we use Vertex ML Metadata; to stitch it all together, we use Vertex Pipelines to automate the process. Adaptation is another very interesting area where we're seeing more customer demand.
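The "coding model generates SQL" retrieval pattern Kevin describes can be sketched as follows. `generate_sql` is a hypothetical stand-in for a code-generation model (a real system would send a prompt containing the schema and the question); here it returns a canned query so the surrounding flow is runnable against an in-memory SQLite database.

```python
import sqlite3

SCHEMA = "CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)"

def generate_sql(question: str, schema: str) -> str:
    # In production, something like
    #   prompt = f"Schema: {schema}\nQuestion: {question}\nSQL:"
    # would be sent to a code model. Stubbed with a canned query here.
    return ("SELECT customer, SUM(total) FROM orders "
            "GROUP BY customer ORDER BY 2 DESC LIMIT 1")

# Hypothetical order data for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "acme", 120.0), (2, "globex", 80.0), (3, "acme", 40.0)])

# The application never hand-writes this query; it runs whatever the
# model produced for the user's natural-language question.
sql = generate_sql("Which customer spent the most?", SCHEMA)
top_customer, spend = conn.execute(sql).fetchone()
print(top_customer, spend)  # → acme 160.0
```

The same shape applies to the Cypher/Neo4j variant Kevin mentions: only the target query language and the prompt change.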
Very good. Ignacio, I'll go to you first: what are the technical considerations you have to keep in mind when you're trying to deploy this around the world?

The first thing, again, is scalability and replicability of what we are doing, because we have to secure the data. We have a lot of local regulations; GDPR in different countries has its own flavors, and we have created policies around making sure that there is no bias in the models and that we can detect those biases. Europe is very regulated, probably very different from America, and we have to comply with a lot of things. You can do all of this, but it takes a lot of time, and by the time you deploy, your data scientists are going to kill themselves because they have been spending 99% of their time on activities that are not related to the model.

So architecturally, what we have done is take the problem of data transport out. Making sure that the data arrives at the right place is something we do with engineering we have created for all the data logistics, which allows us to monitor and make sure the data is encrypted and anonymized, and if we change anything in the policies, it applies to all the data pipelines we have. Then we have the processing inside, which is standardized on BigQuery, and then on top, and here is where Vertex AI and your product are fundamental, we have been a very early adopter: we created something we call the AI Booster, based on Vertex AI. It's our adaptation, and it's where we receive the models and where we can do what you were describing, running models with your own data, with data that stays inside.

So architecturally we have the data logistics, with policies, security, encryption, and monitoring, all the good stuff; then we have the engine that runs all the queries, which is the nerve center, if you want to call it that; and then on top we have created, with Vertex AI, the interface to run all the models and to co-create. What that allows is that our data scientists are no longer working on encryption, data engineering, and all the bureaucracy, and equally, we can share models across the markets. The data engineers in Italy who created this model just pass the information along, and the team in Germany that is going to run it can do it in weeks rather than months, which was the previous setup. So it's very important for us to spend the time getting the foundation right. In parallel, we allow a lot of experimentation, because it's very important that people can experiment and see the power in a safe environment where they can play, but the deployment is very automated and very secure. So there's a lot of engineering behind it that allows us to run this properly.

Arvind, can you double-click into the technical aspects of yours? Sure.
The team came up with a really ingenious two-step process. On the one hand, we have thousands of product catalogs in our database, and the Google product taxonomy has roughly 5,500 classes and subclasses. What the team did was first use Gecko to create embeddings and narrow down the options, and then pass that along to create a prompt and run it through text-bison to produce the final results.
Very cool. Okay, so this is a little bit of a self-serving question, but why did you choose Google Cloud? And the answer can't be Donna and Kevin, so that's not on the table.

For us it was a proper process, five or six years ago, when we were defining our cloud strategy, and we did a very thorough analysis. The three reasons, I would say: your heritage in data; the innovation and roadmap that you were proposing; and the approach and ways of working. It was very refreshing to see a relationship of co-innovation, trying to tackle problems together, rather than "this is a price list, just consume these products and services." And we have done that, and we have been very good partners so far. I always have to say "so far."

Blue Core is natively built on GCP, so the team is very comfortable using the tools and technology, and Google has done a fairly good job building the gen AI technologies alongside the existing technologies. The other things you mentioned, around data governance and security, are important to us, and a lot of that is also baked into the gen AI technology. So that's the first reason. The second is that we actually started this project on GPT-3.5 and got really good results, then moved on to GPT-4 and got really good results as well, and then when Google released PaLM 2 we decided to try it out, and so far the results have definitely been better than what we've seen with OpenAI's models.

Absolutely. Okay, so we've talked about use cases, and we've talked about technical solutions, but at the end of the day it's all about that business value, right? How do you actually see those business results? Donna, I want to start with you: how do we see those business results, and can you give me some examples? Because without them it's kind of pointless; you want to see those results.
Yeah, so let me start with AlphaFold. In the case of AlphaFold, customers were able to conduct experiments much faster, get much quicker insights, and minimize the high ratio of failures from more traditional methods. The impact was really incredible to see, and it's one of the reasons why I'm in this field: they were able to accelerate the drug discovery process, biotech and pharmaceutical companies alike. More generally, within the generative AI space, we see customers getting benefits in terms of employee productivity, allowing them to focus on more value-added tasks; more intuitive and better user experiences built with generative AI; and new insights or ideas that were impossible before. But I think Arvind and Ignacio will have great input here.

We had two overarching goals for this specific project. The first was obviously the accuracy of the data and the results, and the second was the cost of building and maintaining this feature, and what we found was that using LLMs and gen AI we achieved both goals. To step back, we attempted to solve this problem last year using traditional AI techniques, and, like I mentioned, you're talking about thousands of product catalogs and around 5,000-plus classes in the Google taxonomy. Trying to build a model on that is extremely difficult, and scaling it across multiple verticals within the retail space makes it even more challenging. For us it was extremely expensive, from a human-time perspective, in computational resources, and in the amount of data we needed to train and build those models, and a lot of that was solved with gen AI. Absolutely.
all right I think that I will I will
talk about three dimensions of benefits
first I will go to the use case and the
reason why I'm here which is that
summarization and the the understanding
on on what the customers are calling us
and then I will go to the three areas
where we are working on generative AI
because that's one use case but really I
mean we are going ahead in in many areas
and finally from technology point of
view the benefits about velocity and
cost of building because that is what
I'm responsible for in order to learn
more on the on the recent side so the
these particular use case which is the
summarization on all the calls and why
people is calling to our call centers is
very simple we were not able to do it
before so it's a paradigm change in the
past that was impossible to do
simple as that
um and now we can do it and that
information has changed completely our
understanding and our relation and
proximity with our customers because we
were doing surveys and getting scores
and getting some comments and then it
was a team trying to understand what
does that means and then we were cross
uh Crossing that data with technical
problems that we have and then making
interpretations and you you should see
the debates that we had in our customer
boards that we create to try to
understand and be better and now we have
three really really granular information
that is precise is what they are saying
it's not what we think that they're
saying so that is a paradigm change I
cannot put money on that but we are
trying to reduce 30 percent our lead
detractors and we are in in that track
so he's doing very well then if I go to
The second question: which different areas are we working on? We are working on chatbots and interaction with the customers. It's early days, but we see an incredible opportunity there. We have chatbots today, but we have had to invest a lot of money and time in training them for the different languages; as I explained, we operate in many languages, so it is expensive, it is not precise, and the experience for the customer is not right. Our early tests are showing that this can again be a paradigm change compared with what we were doing before. So one big area, across the different domains of customer operations and all the different call centers that we have, is chatbots.

The second area is what we call copilots: making sure that people can really do their jobs better, and that implies a lot of areas. If you are in my team, the technical team, that means coding, and we are seeing productivity of ten to one in comparison with not using it; it is still experimental, so I won't go into details. Then there are the legacies: we grew through mergers and acquisitions, so we have a lot of legacy systems with documentation that does not exist and knowledge that is disappearing. We are testing how to get those systems documented, and that again changes our ability to run operations differently, or to maintain systems that were not possible to maintain before. So we are experimenting with what we call copilots, and you can add every different business area; we are doing experiments on all of them.

Then there is knowledge management. We are a massive company with a lot of information, and managing knowledge is a big problem for us; doing it well can completely change how our customers receive our services, and that is the third area. So the first benefit was in that first piece, and the second is in these three areas. Finally, for me, something like Vertex AI and all the engineering around it is about velocity: the fact that we were able to build, in weeks, a real application that is working and taking all the calls in a full country is only possible because the tools are there, and the cost and the velocity of the platform are really helping us get the value.
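The legacy-documentation copilot idea above can be sketched as a small pipeline: split an undocumented module into its top-level functions and ask a model to draft documentation for each. This is a minimal sketch under stated assumptions, not Vodafone's implementation; the `llm` parameter stands in for whatever model API is used, and the prompt wording is invented for illustration.

```python
import ast

def top_level_functions(source: str) -> list[str]:
    """Extract the source text of each top-level function in a legacy module."""
    tree = ast.parse(source)
    return [ast.get_source_segment(source, node)
            for node in tree.body
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))]

def doc_prompt(snippet: str) -> str:
    """Build a documentation-drafting prompt for one function."""
    return (
        "You are documenting an undocumented legacy system.\n"
        "Explain what this function does, its inputs, outputs, "
        "and side effects:\n\n" + snippet
    )

def draft_docs(source: str, llm) -> list[str]:
    """Run every top-level function through the (pluggable) model."""
    return [llm(doc_prompt(s)) for s in top_level_functions(source)]

# Demo with a stub model; swap in a real API call in practice.
legacy = "def tally(xs):\n    return sum(x * 2 for x in xs)\n"
print(draft_docs(legacy, llm=lambda p: "[draft]"))  # → ['[draft]']
```

A human reviewer would still need to check each draft, which matches the human-in-the-loop advice given later in the panel.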
I really like this framing: what are you changing, how much does it cost, how effective is it, and can you measure it? Because otherwise it's just a boondoggle; you're spending money and who knows what's happening. But if you can prove it, you can double down on it and build on top of it. Absolutely. Kevin, can you talk a little bit about what you learned along the way of this magical gen AI journey?
In short, I learned that we have to continue learning; there's just so much to learn. Think about where we were six months ago, look at where we are today, and you can imagine what it's like six months from now. So it's a new normal, and the new normal is that you're really going to have to keep the learning up. But the good news is accessibility. Think about a year ago: how could you get your hands on LLMs? You had to spin up your own VMs, set it all up, download some framework and install it yourself before you could even prompt it, and you probably had to troubleshoot a bunch of libraries along the way. We've gone through all of that.
Today it's available everywhere. All the cloud vendors have it; we have Cohere, we have Anthropic. There's really no excuse not to have that hands-on experience, so go for it. There are services you can sign up for and try for free, so definitely try them out, and we have frameworks like LangChain to help you build these applications as well. So go hands-on. The last thing: we used to say "read the manual." In this world we may want to start saying "read the papers," as in research papers; you'll get a lot of insight into what's going on. A lot of this started with the Transformer paper back in 2017, and it's still a really good paper to read. There are good papers out there, and if you don't have time, read the abstract. That's where you can really get up to speed on what's coming down the pipe.

I used to go to the Google AI blog and just read all the papers they were publishing, because I figured if Google is publishing it, it's probably state of the art. That's how I used to cheat-sheet my way through.

Exactly, read the papers — or you could run them through an LLM and get a summarization.
I think for a subset of problems, like the one that I mentioned, LLMs provide a phenomenal foundation, because there is so much information encoded in these models: they are trained on so much relevant data that it makes the lift for us a lot easier. Like I mentioned, we tried this with traditional AI and it was really not sustainable for us to scale, and what gen AI now provides are the tools for us to not only solve but also scale problems such as the one I mentioned.
For me, three things. One: this is really transformational, and everybody can see it. Don't try to centralize it in IT or in any single area; let people experiment. Make sure you create a safe environment for experimentation, and that you can measure it to then decide where you're going to invest, but let people play, because people need to play; don't restrict it. On the other hand, invest in foundations, because the only way to get real value is combining the LLMs with your own data. So invest in foundations, invest in getting the right architecture, because then you can go super fast when the experiments show that a use case is worth scaling. That is my learning. And sorry, one more: stay close to your CEO. Make sure that they understand the potential and the value, and that they are not bombarded by vendors telling them that this is a silver bullet to resolve every single problem. So invest in education, in bringing the whole company along, and do not stop, and do not centralize it, because then you kill it.

I've heard so many customers say "the board of directors, the CEO, have told me we need to do this ASAP," so yes, it's in every single boardroom and every single CEO's office.
Okay, one last question and then we'll open it up to Q&A, so get ready; there are mics here in case you want to ask questions. Last question: any other pieces of advice, nuggets of wisdom, that you have for the audience here in terms of business use cases?
Sure, yeah. I would say, and I think we've all touched on this already: experiment and iterate. Generative AI truly has the potential to transform each industry; we're living in a very exciting time, but nothing really substitutes for hands-on experience to understand what the value is, specifically for your business, and also to understand the limitations. So, looking within an organization — and this is what we've seen successful customers do — really look at who's excited to pioneer this: which domain experts and which machine learning experts. Get them together to identify what success looks like and to work on the use case. We did this as well: we worked with an SRE team on the postmortem search and summarization use case, with experts who really helped define what success looks like and whether our outputs are good. We started with a very small set of experts and gradually expanded to a trusted-tester group; we started with a small data set of 100 postmortems and expanded to a thousand, and now we're rolling this out more broadly. So my advice would be: just get started.
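The postmortem search step described above would normally pair an embedding model with a vector database; as a dependency-free sketch, here is the retrieval logic with a toy bag-of-words vector standing in for real embeddings (an assumption for illustration — production systems would use a proper embedding model and vector store).

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def search(query: str, postmortems: list[str], k: int = 3) -> list[str]:
    """Return the k postmortems most similar to the query."""
    q = embed(query)
    ranked = sorted(postmortems, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:k]

incidents = [
    "Database failover caused elevated latency in checkout",
    "Expired TLS certificate broke the login service",
    "Bad config push took down the EU load balancers",
]
print(search("certificate expiry outage", incidents, k=1))
# → ['Expired TLS certificate broke the login service']
```

The retrieved postmortems would then be passed to an LLM for summarization, mirroring the small-corpus-first rollout (100 documents, then a thousand) the panelist describes.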
So I consider gen AI, or LLMs, to be yet another tool in your toolbox. As you go through building features, I'm sure you have some success metrics, and if gen AI is going to help you reach those metrics and those goals, then yes, it's the right tool. That's the first thing. And given, as you mentioned, Kevin, how rapidly things are changing, pick use cases that have a tolerance for getting it wrong: for example, use cases that are internal-facing, or use cases that have a human in the middle so you can rectify any issues. The last one I would say is: invest in tooling, technology, and platform. What I mean by that is that today we run 20-plus models, we have customers asking us questions about the performance and the output of those models, and our teams have to debug and fix issues, so we've built a lot of tooling to be able to observe and scale these systems. I keep reminding my team: if you're going to build a ship, be prepared for a shipwreck. So while you're focused on LLMs and gen AI, ensure you have enough tooling and platforms to support that.
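The "build a ship, be prepared for a shipwreck" point amounts to instrumenting every model call. A minimal sketch of that kind of observability layer (the model name and the wrapped callable are hypothetical; real systems would export these counters to a metrics backend):

```python
import time
from collections import defaultdict

class ModelMetrics:
    """Per-model call counts, failure counts, and cumulative latency."""

    def __init__(self):
        self.calls = defaultdict(int)
        self.errors = defaultdict(int)
        self.latency = defaultdict(float)

    def observed(self, name, fn, *args, **kwargs):
        """Invoke fn, recording latency and failures under `name`."""
        self.calls[name] += 1
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        except Exception:
            self.errors[name] += 1
            raise
        finally:
            self.latency[name] += time.perf_counter() - start

# Demo: wrap a stand-in 'model' that just truncates its input.
metrics = ModelMetrics()
summary = metrics.observed("summarizer-v2", lambda text: text[:20],
                           "a very long transcript ...")
print(summary, metrics.calls["summarizer-v2"])
```

With 20-plus models behind one such wrapper, a team can answer customer questions about per-model performance and spot regressions without digging through logs.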
Sorry, nothing else to add; I think I've already mentioned everything. Precisely that: my piece of advice is that if anyone tells you they know what gen AI is going to be doing in five years, they're lying to you. So stay humble, and stay attached to the actual short-term use cases, because I'm sure it's going to be crazy in five years.

To your point, we're all just like, "wait, what did we announce?" Absolutely. All right, we are out of time. We'd love it if you give us feedback here, and I think we'll be around if you have any other questions as well. Thank you, everyone.