Common business use cases for generative AI

Google Cloud
11 Dec 2023 · 32:43

Summary

TL;DR: In a panel discussion, industry experts from Google, Vodafone, and Blue Core shared insights on applying generative AI in business. They highlighted the importance of prioritizing customer value and technical feasibility in AI projects. Notable use cases included AlphaFold's impact on drug discovery, product cataloging, and customer service operations. The panel emphasized the transformative potential of AI, the need for continuous learning, and the value of hands-on experience in understanding its practical applications and limitations.

Takeaways

  • Prioritization of AI use cases focuses on customer value, friction points, technical feasibility, and alignment with internal research innovations.
  • AlphaFold on Google Cloud is a prime example of AI research translated into a practical solution, accelerating drug discovery by understanding protein structures.
  • Product cataloging is a significant use case where AI helps categorize products for search and create website copy, improving customer experience and retail performance.
  • Customer service operations benefit from AI through conversational agents and internal support for SRE teams, enhancing post-mortem search and summarization.
  • Generative AI is used to improve platform features, increase internal efficiencies, and better serve customers by standardizing product data and mapping it to taxonomies like Google's.
  • Telecommunication companies leverage AI for network deployment, predictive maintenance, and customer call analysis, translating and summarizing calls to understand customer issues and improve service.
  • Technical solutions involve embedding models, vector databases, and vector search engines to process and retrieve unstructured data efficiently.
  • Transformer models extend beyond text and images to non-traditional use cases, such as analyzing and predicting user events in software applications.
  • Global deployment of AI solutions requires scalability, replicability, data security, and compliance with local regulations like GDPR.
  • Business value from AI is measured in increased employee productivity, improved user experiences, and new insights or ideas that were previously impossible.
  • Continuous learning, hands-on experience, and staying current with the latest AI research and technologies are emphasized for maximizing the potential of generative AI.
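
The product-cataloging takeaway can be made concrete with a small sketch that maps retailer-specific product titles to the nearest label in a standardized taxonomy. Everything below is illustrative only: the bag-of-words "embedding" stands in for the real embedding model or LLM the panel describes, and the taxonomy slice is made up:

```python
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding'; a real system would call an embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical slice of a standardized taxonomy (in the spirit of Google's product taxonomy).
taxonomy = ["Apparel > Shirts > T-Shirts", "Apparel > Shoes", "Electronics > Phones"]
taxonomy_vecs = {label: embed(label.replace(">", " ")) for label in taxonomy}

def categorize(product_title):
    """Map a retailer-specific title to the closest standard taxonomy label."""
    vec = embed(product_title)
    return max(taxonomy_vecs, key=lambda label: cosine(vec, taxonomy_vecs[label]))

print(categorize("true cut t-shirts"))  # nearest label: Apparel > Shirts > T-Shirts
```

The same nearest-label idea is what lets differently named products ("true cut t-shirt", "crew tee") land under one standard category for search and analytics.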

Q & A

  • What is the main focus of the session on common business use cases for generative AI?

    -The session focuses on discussing the top business use cases for generative AI, how to prioritize them, and how to get started with their implementation in various industries.

  • Who are the panelists in the session and what are their roles?

    -The panelists include Nema Dakiniko, a product manager for Google's generative AI portfolio; Ignacio Garcia, Vodafone's global director of data analytics and AI and CIO of Vodafone Italy; Arvind Christian, who runs the engineering, data science, and solution architecture teams at Blue Core; and Donna, who leads the Technical Solutions management team for generative AI at Google Cloud.

  • How does Google prioritize its AI solutions for customers?

    -Google prioritizes AI solutions by focusing on what will add value to the customers, identifying friction points they face, and considering the technical implementation of the solutions. They also look internally to their research teams to bring innovations to customers.

  • Can you explain the significance of AlphaFold on Google Cloud?

    -AlphaFold is a significant research project by DeepMind that focuses on solving the protein folding problem. It's important because understanding a protein's structure can lead to the development of drugs that modulate its function. Google Cloud has operationalized this research into a solution that adds value to healthcare organizations by making it reproducible, scalable, and cost-effective.

  • What are some of the technical patterns observed in different AI use cases?

    -Technical patterns include using embedding models to process unstructured data and index it with a vector database, using code-generation models to produce SQL or Cypher for database access, and applying Transformer models to non-image, non-text use cases.
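
The first pattern in this answer can be sketched end to end: embed documents, index them, and retrieve the closest match for a query. The snippet below is an illustration only; the toy term-count "embedding" and the in-memory index stand in for a real embedding model and a vector database such as Vector Search or pgvector:

```python
from collections import Counter
import math

def embed(text):
    """Toy term-count 'embedding'; a production system would use a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorIndex:
    """Minimal in-memory stand-in for a vector database."""
    def __init__(self):
        self._items = []  # (vector, original document)

    def add(self, doc):
        self._items.append((embed(doc), doc))

    def search(self, query, k=1):
        qv = embed(query)
        ranked = sorted(self._items, key=lambda item: cosine(qv, item[0]), reverse=True)
        return [doc for _, doc in ranked[:k]]

# Index a few made-up SRE post-mortems / support documents, then retrieve by meaning-ish overlap.
index = VectorIndex()
index.add("postmortem: outage caused by an expired TLS certificate")
index.add("runbook: rolling back a bad deployment")
index.add("faq: how a customer resets their password")

print(index.search("what caused the certificate outage"))
```

In the real pattern the retrieved chunks are then handed to the LLM as context (retrieval-augmented generation), which is what makes "data outside the LLM" usable.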

  • How does Vodafone Italy utilize AI in its operations?

    -Vodafone Italy uses AI for churn analysis to understand customer behavior, next-best-offer models, network deployment planning for capex efficiency, predictive maintenance, and translation and summarization of customer call data to understand and improve customer service and reduce detractors.
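
The call-analysis pipeline described here (transcribe, summarize, categorize the call reason) can be approximated in miniature. The extractive summarizer and keyword-overlap intent tagger below are toy stand-ins for the LLM steps, and every name, stopword list, and category is invented for illustration:

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "is", "and", "my", "i", "to", "it", "was"}

def summarize(transcript, n=1):
    """Toy extractive summary: keep the sentence(s) with the most frequent content words.
    In the real pipeline an LLM produces the summary; this is only a stand-in."""
    sentences = [s.strip() for s in re.split(r"[.!?]", transcript) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", transcript.lower()) if w not in STOPWORDS]
    freq = Counter(words)
    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS)
    return sorted(sentences, key=score, reverse=True)[:n]

def tag_intent(summary, intents):
    """Assign the call-reason category sharing the most words with the summary."""
    words = set(re.findall(r"[a-z']+", " ".join(summary).lower()))
    return max(intents, key=lambda i: len(words & set(i.split())))

call = ("Hello. My internet connection drops every evening. "
        "The router restarts and the connection drops again. "
        "I would like a refund.")
summary = summarize(call)
print(summary, tag_intent(summary, ["billing refund", "connection drops", "new contract"]))
```

Scaled up to tens of thousands of calls, the tagged reasons are what replace surveys with direct evidence of why customers call.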

  • What are the key technical considerations when deploying AI solutions globally?

    -Key considerations include scalability and replicability, ensuring data security, adhering to local regulations like GDPR, avoiding bias in models, and creating policies to maintain these standards.
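
One concrete piece of such compliance work is masking personal data before transcripts or logs reach a model or leave a region. A minimal sketch under loud assumptions: the two regex patterns below are simplified illustrations, not a complete PII detector or a description of any panelist's actual system:

```python
import re

# Order matters: mask emails first so the phone pattern never eats digits inside them.
PATTERNS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "<EMAIL>"),   # email addresses
    (re.compile(r"\+?\d(?:[\s-]?\d){8,13}"), "<PHONE>"),        # 9-15 digit phone-like runs
]

def pseudonymize(text):
    """Replace simple PII patterns with placeholders before downstream processing."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(pseudonymize("Call me on +39 340 123 4567 or mail mario.rossi@example.com"))
# Call me on <PHONE> or mail <EMAIL>
```

Centralizing a pass like this in the data pipeline, rather than in each model project, is what lets a policy change propagate everywhere at once, as the panel describes.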

  • Why did Blue Core choose Google Cloud for its AI projects?

    -Blue Core chose Google Cloud due to its innovation roadmap, co-innovation approach, data governance and security features, and the performance they observed with Google's AI technologies compared to other models.

  • What are the business benefits of using generative AI?

    -Business benefits include increased employee productivity, more intuitive and better user experiences, new insights or ideas that were impossible before, and cost-effectiveness in problem-solving and scaling solutions.

  • What advice does the panel have for businesses looking to implement generative AI?

    -The advice includes experimenting and iterating with AI, investing in foundational architecture, creating a safe environment for experimentation, not restricting innovation, and educating the entire company about the potential of AI.

  • How does the panel suggest businesses should approach the rapid evolution of AI technologies?

    -Businesses should stay humble, focus on short-term use cases, invest in tooling and platforms to support AI, and ensure that their AI initiatives are not centralized to avoid stifling innovation.

Outlines

00:00

๐Ÿค Introductions and Panel Discussion on AI Use Cases

The panel discussion begins with introductions of the speakers: Nema Dakiniko, a product manager for Google's generative AI portfolio, Ignacio Garcia, the global director of data analytics and AI at Vodafone, Arvind Christian, head of engineering at Blue Core, and Donna, who leads the Technical Solutions management team for generative AI at Google Cloud. The panel aims to discuss common business use cases for AI, with a focus on how to prioritize and implement AI solutions effectively. The conversation starts with Donna explaining the prioritization process based on customer value and technical feasibility, and then moves on to specific examples like AlphaFold on Google Cloud and its impact on drug discovery.

05:00

Product Cataloging and Customer Service Operations

The discussion continues with the application of generative AI in product cataloging and customer service operations. Arvind and his team have worked on categorizing new product labels for search, improving website content, and supporting SRE teams with post-mortem search and summarization. Ignacio shares Vodafone Italy's experience with AI in customer service, emphasizing the importance of understanding customer needs and improving interactions through AI. The conversation highlights the benefits of using AI to standardize product data and the technical considerations for deploying AI solutions globally.

10:01

๐Ÿ› ๏ธ Technical Design and Implementation of AI Solutions

Kevin and Donna discuss the technical aspects of designing and implementing AI solutions. They cover patterns like using embedding models and vector databases for unstructured data, code generation models for database access, and applying Transformer models to non-text use cases. The conversation also touches on the optimization of the inference pipeline for AlphaFold, the importance of reproducibility and experiment tracking, and the use of Vertex AI for automating processes and ensuring security.
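
One of the patterns above, generating SQL for database access, is easy to sketch together with a guardrail around the model's output. The `generated_sql` string below merely stands in for what a code-generation model might return (hypothetical), and the guard permits only a single SELECT before executing it against an in-memory SQLite database:

```python
import sqlite3

# Stand-in for model output; a real system would prompt a code model with the schema.
generated_sql = "SELECT category, COUNT(*) FROM products GROUP BY category ORDER BY category"

def run_readonly(conn, sql):
    """Guardrail: execute model-generated SQL only if it is a single SELECT statement."""
    statement = sql.strip().rstrip(";")
    if ";" in statement or not statement.lower().startswith("select"):
        raise ValueError("only single SELECT statements are allowed")
    return conn.execute(statement).fetchall()

# Tiny product-catalog table to run the generated query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, category TEXT)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("true cut tee", "t-shirts"), ("crew tee", "t-shirts"), ("derby", "shoes")])

print(run_readonly(conn, generated_sql))  # [('shoes', 1), ('t-shirts', 2)]
```

The validation step is the important part: generated SQL is untrusted input, so production systems typically combine a check like this with read-only database credentials.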

15:02

๐ŸŒ Global Deployment and Business Value

The panelists delve into the challenges and strategies of deploying AI solutions worldwide. They discuss the need for scalability, replicability, and compliance with local regulations like GDPR. The conversation highlights the importance of data logistics, policies, and the use of Vertex AI for secure and efficient model deployment. The panelists also share their experiences with the business value of AI, including improved accuracy, cost-effectiveness, and the ability to understand customer needs better.

20:05

Insights and Learnings from the Journey with Generative AI

The panel concludes with insights and advice on working with generative AI. Kevin shares his learnings about the importance of continuous learning, accessibility of AI tools, and the value of research papers. He emphasizes the need for a safe environment for experimentation and the transformative potential of AI. Donna advises on experimenting, iterating, and investing in foundational technologies to combine AI models with proprietary data. The panelists stress the importance of staying humble and focused on short-term use cases while preparing for the rapid evolution of AI technology.

Keywords

AI

Artificial Intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. In the context of the video, AI is the central theme, with discussions on its business applications, technical implementations, and the impact on various industries.

Product Manager

A Product Manager is a professional who is responsible for guiding the development of a product, from its conception to its launch and ongoing improvement. They often work closely with cross-functional teams and are key decision-makers in product development. In the video, Nema Dakiniko is introduced as a Product Manager at Google, highlighting the role of product management in AI technology development.

Data Analytics

Data Analytics is the process of examining data sets to draw conclusions about the information they contain. It involves the use of statistical tools, data mining, and predictive modeling to analyze data and make informed decisions. In the video, Ignacio Garcia discusses his role as the global director of data analytics and AI, emphasizing the importance of data analytics in driving AI solutions.

Generative AI

Generative AI refers to AI systems that are designed to create new content, such as text, images, or music, based on patterns learned from existing data. It is a subset of AI that focuses on the creation of new outputs rather than just analyzing or recognizing patterns. The video discusses the use of generative AI in various business contexts, such as customer service and product cataloging.

ML Infrastructure

Machine Learning (ML) Infrastructure refers to the underlying technology and systems that support the development, deployment, and maintenance of machine learning models. This includes hardware, software, and frameworks that enable data scientists and engineers to build and scale ML applications. In the video, the panelists discuss the importance of ML infrastructure in operationalizing AI research and bringing it to customers.

Customer Service Operations

Customer Service Operations involve the processes and strategies that organizations use to manage and improve their interactions with customers. This includes handling inquiries, resolving issues, and providing support to ensure customer satisfaction. In the context of the video, AI is being used to enhance customer service operations by automating responses and providing insights into customer needs.

Technical Solutions

Technical Solutions refer to the application of technology to solve specific problems or meet particular needs. In the business context, this often involves the development and implementation of systems, tools, or processes that address challenges faced by an organization. The video discusses how technical solutions, particularly those involving AI, are identified and built to add value for customers.

Data Structures

Data Structures are specialized formats for organizing, storing, and managing data in a computer's memory. They are crucial for efficient data access and modification and can significantly impact the performance of a computer program. In the context of the video, data structures are important for combining with AI models to create effective solutions, such as mapping product catalog data to standardized taxonomies.

Replicability

Replicability refers to the ability to produce consistent results or outcomes when the same process is repeated under the same conditions. In scientific research and technology, replicability is essential for verifying the effectiveness of a method or solution. In the video, the panelists emphasize the importance of creating AI solutions that are not only effective but also replicable across different regions and languages.

Google Cloud

Google Cloud is a suite of cloud computing services offered by Google, which includes Google Compute Engine, Google Cloud Storage, and Google App Engine. It provides businesses with a range of tools and services to build, deploy, and scale applications, store data, and process large amounts of information. In the video, Google Cloud is the platform where AI research like AlphaFold is operationalized and made available to customers.

Highlights

Session on common business use cases for generative AI

Nema Dakiniko, a product manager at Google, introduces the session

Ignacio Garcia, global director of data analytics and AI, discusses Vodafone's AI initiatives

Arvind Christian shares Blue Core's work in engineering, data science, and solution architecture

Donna leads the Technical Solutions management team for generative AI at Google Cloud

Prioritization of AI use cases based on customer value and technical feasibility

AlphaFold on Google Cloud as a notable AI research breakthrough

Product cataloging as a key business use case for AI

Customer service operations improved with AI conversational agents and internal support

Vodafone Italy's use of AI for call center call summarization

Technical design patterns for different AI use cases

Importance of data structures and taxonomies in AI solutions

Global deployment of AI solutions with considerations for scalability and replicability

Use of generative AI in retail for customer movement and shopping behavior analysis

Google Cloud's role in providing AI tools and infrastructure

Business value of AI use cases and measurable outcomes

Advice for businesses on adopting AI: Experiment, iterate, and invest in foundational technology

The transformative potential of generative AI across industries

Transcripts

play00:02

[Music]

play00:12

let's get started hey everyone uh

play00:15

welcome to this session on common

play00:17

business use cases for generative AI I

play00:19

am Nema Dakiniko I'm a product manager

play00:21

at Google for our generative AI portfolio

play00:24

but I will turn it over to our esteemed

play00:28

guests here who can introduce themselves

play00:30

because we do have a very packed panel

play00:31

and we're going to start with you hi

play00:33

everybody I'm Ignacio Garcia I'm the

play00:36

global director of data analytics and AI

play00:39

for Vodafone and I'm also the CIO of

play00:43

Vodafone Italy

play00:45

everyone Arvind Christian here I head

play00:48

engineering at Blue Core so I run

play00:50

engineering data science and our

play00:53

solution architecture teams

play00:56

hi I'm Donna I lead the Technical

play00:58

Solutions management team for generative

play01:00

AI at Google cloud and together with the

play01:03

solution architecture teams with Kevin's

play01:05

team we identify design and build AI

play01:09

Solutions

play01:10

foreign

play01:17

Solutions and we also do ml

play01:19

infrastructure as well

play01:21

very cool uh okay so let's start first

play01:25

by talking about top business use cases

play01:27

and Don I'm going to start with you on

play01:30

this one but I really want to understand

play01:31

many people here are like look gen AI

play01:34

fantastic this is great I'm sold but

play01:36

what are those top use cases well how do

play01:38

you prioritize them how to get started

play01:41

sure yeah so um I can start with

play01:43

prioritization and then go into a few

play01:45

use cases so we really in our

play01:48

prioritization we focus on our customers

play01:50

what will really add value to them where

play01:53

are they seeing friction points and then

play01:55

the technical implementation of

play01:57

that we also look internally to some of

play02:00

our research teams and think about how

play02:03

can we take those Innovations and then

play02:05

bring those to our customers so one

play02:08

example that we worked on around one and

play02:10

a half years ago

play02:11

was AlphaFold on Google Cloud so

play02:14

deepmind had this amazing research and

play02:16

maybe to give a little bit of context on

play02:18

the protein folding problem scientists

play02:21

have long been interested in solving the

play02:23

protein folding problem because once a

play02:26

protein's structure within a cell is

play02:29

understood then scientists are able to

play02:31

develop drugs that can modulate its

play02:34

function

play02:34

but for our Healthcare organizations in

play02:37

order to be able to actually leverage

play02:39

that they need additional requirements

play02:41

so for example reproducibility

play02:44

scalability it has to be cost effective

play02:48

um and so we took their their amazing

play02:51

research and we operationalized it on

play02:53

Google Cloud through a solution

play02:54

some other areas more recent where we're

play02:57

seeing traction or for example product

play02:59

cataloging so being able to categorize

play03:01

when there's a new product label it for

play03:04

search and then also create the website

play03:06

copy and arvind and his team have done

play03:09

some amazing work in this space and

play03:11

customer service operations and it's not

play03:13

just a conversational agent which you

play03:16

may have experienced on a website that's

play03:18

answering questions but also internally

play03:21

for example supporting SRE teams with

play03:23

post-mortem search and summarization

play03:25

which we've done some work on but also

play03:28

supporting support agents with

play03:30

summarization and next steps and Ignacio

play03:33

and his team have done some some great

play03:34

work here

play03:38

what are some of the use cases and

play03:40

prioritizations that your organization

play03:41

does yeah

play03:43

um before I jump into a use case maybe

play03:45

just a little bit step back and what we

play03:48

at Blue Core do so we can connect the dots

play03:50

so we are an identification and a

play03:53

customer Movement platform so we work

play03:56

with large Enterprise retailers to

play04:00

identify and then convert Shoppers to

play04:05

uh repeat customers

play04:08

so we've used traditional Ai and created

play04:12

over 20 retail models using first party

play04:16

data so Shopper information

play04:20

behavioral data and then product

play04:24

information

play04:25

so these models are baked into our

play04:28

platform so marketer can use these

play04:31

models to create campaigns and

play04:33

audiences

play04:35

so the content that needs to be

play04:37

generated

play04:38

the channels in which to deliver these

play04:41

uh the the content and finally the

play04:44

timing when to deliver are all

play04:46

personalized on a per Shopper basis

play04:49

with the Advent of gen AI we looked at a

play04:52

couple of areas you know one is how can

play04:54

we improve uh our features on our

play04:58

platform we looked at internal

play05:00

efficiencies as well and option

play05:02

opportunities to better serve our

play05:04

customers

play05:05

so um the the problem that she was

play05:10

referring to uh is core to our value

play05:13

proposition so taking unstructured

play05:17

uh product catalog data and mapping it

play05:20

to Google's product taxonomy

play05:23

so for example

play05:25

um a retailer could call this a t-shirt

play05:28

another retailer could call it a

play05:30

categorize it as a true cut T-shirt and

play05:34

so on and so forth but if you

play05:36

standardize it in the Google's taxonomy

play05:39

this is probably labeled as a t-shirt

play05:42

which is under categorized under a shirt

play05:45

which is apparel and so on so there are

play05:47

a number of uh advantages to

play05:50

standardizing product catalog one is we

play05:54

will improve our models and our recs

play05:56

our customers will be able to now

play05:59

analyze performance within their product

play06:01

catalog and finally we will be able to

play06:04

deliver Trends within verticals across

play06:08

the retail space

play06:10

so using gen AI

play06:12

we were not very successful using

play06:16

traditional AI so with Gen AI we were

play06:19

able to solve this specific problem very

play06:21

cool absolutely Vodafone

play06:23

hopefully people are a little bit more

play06:24

familiar with it in terms of what you do

play06:25

but love to understand your use cases

play06:27

and prioritizations oh thank you I think

play06:29

that I still will take a couple of

play06:31

minutes just to explain the the

play06:33

complexity that we have and that context

play06:35

maybe help to understand how are we

play06:38

using what we're using so we are a

play06:40

telecommunication company we do mobiles

play06:42

we do

play06:44

televisions we do fixed lines and we do

play06:47

iot so the whole package across the

play06:50

world we have more than 300 million

play06:52

customers we have billions of iot

play06:55

devices so I'm just talking about the

play06:58

scale and then it's in Europe and in

play07:00

Asia and in other areas so languages are

play07:02

completely diverse and this is another

play07:04

point that is very important on the on

play07:05

how we're using and the type of problems

play07:07

that we need to resolve probably a bit

play07:10

of background as well is we have been

play07:12

very focused on partnership with Google

play07:15

on cloud and data so Google is our

play07:18

partner on the data domain and we have

play07:21

been very focused on making sure that

play07:24

our data is in the right place is safe

play07:26

for our customers we have the

play07:27

anonymization we have all the

play07:29

regulations that we have in Europe

play07:30

around privacy and we're super focused

play07:33

on that and we have been using Ai and we

play07:36

have been using Google tooling and for

play07:39

many of our normal operations so if I go

play07:43

it will be churn model analysis on

play07:45

trying to understand why the customers

play07:47

are churning and next best offer models

play07:50

that is on the customer side but then if

play07:52

you go to the network we we do analysis

play07:55

in where to deploy the network so capex

play07:58

efficiency which is super important or

play08:01

and we call it a predictive maintenance

play08:04

so trying to understand what is going to

play08:06

break and and be sure that we can

play08:08

replace the components and and that was

play08:10

successful but then with gen AI we have

play08:13

been now experimenting and and it's a

play08:15

completely different dimension for

play08:16

example the use case that you you are

play08:18

saying we have implemented in Italy so

play08:21

we are getting all the calls that our

play08:24

customers are making to the call centers

play08:25

We're translating them into text and

play08:29

then we're getting a summarization of

play08:30

the problems what was the original

play08:32

intention to reduce the risk that the

play08:34

customers are calling us and to drive

play08:36

automatically the Deep detractors so the

play08:38

NPS detractors this is only

play08:40

possible now that we have large language

play08:44

models available and we were able to do

play08:47

it very fast because we have been very

play08:49

consistent on creating the data

play08:51

structures to combine combine the the

play08:53

models with our data in a good way but

play08:56

this use case in particular so we're

play08:57

taking these 50,000 calls translating

play09:00

them into text summarizing and getting

play09:03

the the reason of the problems and it's

play09:05

a complete Game Changer because then we

play09:07

understand what the customers are really

play09:08

saying we don't need to do surveys and

play09:10

get high level data we're really getting

play09:12

to the actual details on why are they

play09:15

calling us and then we can intervene on

play09:17

that and then that data that was a

play09:19

original reason and then that data has

play09:21

become key to do other things to

play09:23

understand behaviors and understand

play09:24

potential upselling and other areas and

play09:29

the other important thing on that is we

play09:32

need to replicate that across all the

play09:34

countries so replicability and scale is

play09:37

fundamental it's not only doing and what

play09:39

is a use case is can we do it fast and

play09:42

do it securely and do it across the world

play09:44

in a in a way that we can repeat and we

play09:46

will talk later about Vertex AI and

play09:48

different components on Google have

play09:50

allowed us or are allowing us to do that

play09:53

absolutely it's interesting I think both

play09:56

we actually spoke about data and like

play09:58

taxonomies and data structures so this

play10:00

goes into our second question around how

play10:02

do you actually technically design these

play10:04

Solutions and Kevin I'll start with you

play10:07

um okay so Donna's team and my team have

play10:10

worked on a number of different uh

play10:11

business use cases applying gen AI and

play10:14

uh what's interesting is to see a few

play10:16

technical patterns kind of surface up or

play10:18

just kind of permeates across the

play10:20

different use cases right so the first

play10:22

one is how do I get data that's outside

play10:24

of the LLM right into my application

play10:27

right so that use case is very pertinent

play10:30

to you know customer support right for

play10:32

example so uh one of the very uh one of

play10:36

the very common technical patterns is to

play10:37

use

play10:39

um like an embedding model to process

play10:41

your unstructured data and then index it

play10:44

with a vector database you know so

play10:46

Matching Engine now called Vector

play10:48

search right yeah is a very popular

play10:50

option and now we also have AlloyDB with

play10:53

pgvector as well right so there are a lot

play10:54

of options for that yeah and after you

play10:56

index that then you can very quickly

play10:58

retrieve your unstructured data images

play10:59

and text and so forth the other type of

play11:02

data retrieval we're now seeing right is

play11:05

to use a coding model to

play11:08

generate the SQL right to

play11:10

access your relational database or to

play11:12

generate Cypher to access Neo4j right so

play11:15

that's that's kind of another kind of up

play11:17

and coming type you know pattern that

play11:18

we're seeing and another area that we're

play11:21

seeing is around application of these uh

play11:24

language or you know these Transformer

play11:25

models to non-image non-text use cases

play11:29

right so one of the partners that we're

play11:31

working with Fullstory we're helping

play11:33

them build a sequence model to analyze

play11:36

and predict user events right and

play11:39

finally Donna mentioned AlphaFold

play11:42

right well AlphaFold actually is a

play11:44

Transformer model it's very interesting

play11:46

yes so instead of generating language or

play11:49

image it generates protein structures so

play11:52

we took deepminds research

play11:54

we broke up the inference pipeline which is

play11:57

actually a multi-step pipeline right so

play11:59

we applied uh different types of compute

play12:02

to the different processes the earlier

play12:04

parts has a lot of data retrieval so we

play12:07

use high-IOPS CPU nodes for the later

play12:12

compute stages which is extremely

play12:14

compute intensive we use NVIDIA A100

play12:16

gpus right so that's how we can optimize

play12:19

right going from research to production

play12:23

to production we also need to make sure

play12:25

that we're we we have reproducibility

play12:27

right and experiment tracking and for

play12:29

that we use Vertex ML Metadata right and to

play12:31

kind of stitch it all together we use

play12:33

Vertex AI Pipelines to automate the process

play12:36

so adaptation is another very

play12:39

interesting area that we're seeing a

play12:41

more customer uh demands

play12:44

very good Ignacio I'll go to you

play12:46

first what are those technical

play12:48

considerations that you have to have

play12:49

when you're trying to deploy this around

play12:51

the world essentially uh the first thing

play12:54

and again is scalability and

play12:58

replicability of what we are doing

play13:00

because we have to secure the data we

play13:03

have a lot of local regulations uh about

play13:06

GDPR in different countries have their

play13:09

own flavors and we have to to create or

play13:12

we have created policies around and

play13:15

making sure that there is no bias in

play13:17

the model and making sure that we can

play13:19

detect those biases it's a very regulated

play13:21

world in Europe probably very

play13:24

different to America but there we we

play13:26

have to comply with a lot of things

play13:27

and you can do these things but it takes a

play13:29

lot of times and by the time that you

play13:31

deploy then your data scientists are going to

play13:34

kill themselves because they have been

play13:35

spending 99% of their time in activities

play13:38

that are not related to the model so

play13:41

architecturally what we have done is

play13:43

take out the problem of data transport

play13:46

so making sure that their data arrived

play13:48

to the right place is something that we

play13:49

do and we do it with the

play13:52

Engineering that we have created for all

play13:53

the data Logistics that allow us to

play13:55

monitor to make sure that the data is

play13:58

encrypted the data is anonymized and if

play14:00

we change anything on the policies that

play14:02

applies for all the the data pipelines

play14:03

that we have across so that that has

play14:05

been designed and then we have the

play14:08

process inside which is a BigQuery

play14:11

standard and then in the top and here is

play14:13

where vertex and and your product is

play14:15

fundamental we we have in a very early

play14:17

adopter so vertex AI we created

play14:19

something that we call the AI booster

play14:20

based on vertex AI these are our

play14:22

adaptation but that is where we receive

play14:25

the models and where we then exchange

play14:27

and make sure that we can do what you

play14:28

are saying which is running models with

play14:30

your own data and data that is subside

play14:33

so architecturally we have data

play14:35

Logistics we have all the where we have

play14:37

policies security encryption and

play14:40

monitoring all the good stuff then we

play14:43

have the engine that runs all the

play14:46

queries which is our real Nerf systems

play14:49

if you want to call it like this and

play14:51

then in the top we have created with

play14:52

vertex AI the interface to run all the

play14:55

models and to co-create what is that

play14:56

allowing us that now our data science

play14:58

are now working on encryption data

play15:02

engineering uh all the all the

play15:04

bureaucracy and equally we can share

play15:06

models across the market so the data

play15:08

Engineering in Italy that have created

play15:10

this model are now just passing the

play15:13

information and the the guys in Germany

play15:16

that are going to run it I can do it in

play15:17

weeks rather than in months which was

play15:21

um the previous setup so it's very very

play15:23

important for us to take it

play15:25

and spend that time on making sure that

play15:28

we have the foundation right then in

play15:29

parallel we allow a lot of

play15:31

experimentation because very important

play15:32

that the people can experiment and see

play15:34

the power in a safe environment where

play15:36

they can play and they see and they see

play15:38

that the model is right but the

play15:39

deployment is very automatic to

play15:41

automatized and and it's very secure
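The centralized-policy idea described here, anonymize and encrypt in one data-logistics layer so that a policy change propagates to every pipeline, can be sketched roughly as follows. The field names and policy shape are hypothetical illustrations, not Vodafone's actual implementation:

```python
import hashlib

# Toy central policy: every pipeline pulls these steps from one place,
# so changing the policy here changes all data flows at once.
POLICY = {
    "anonymize_fields": ["customer_id", "msisdn"],  # hypothetical field names
    "require_encryption": True,
}

def anonymize(record, policy=POLICY):
    # Replace identifying fields with a stable one-way hash.
    out = dict(record)
    for field in policy["anonymize_fields"]:
        if field in out:
            out[field] = hashlib.sha256(str(out[field]).encode()).hexdigest()[:12]
    return out

def transport(record, policy=POLICY):
    # One shared entry point for all pipelines: anonymize, then mark for
    # encryption. A real system would actually encrypt in transit/at rest.
    record = anonymize(record, policy)
    if policy["require_encryption"]:
        record["_encrypted"] = True
    return record

rec = transport({"customer_id": "42", "country": "DE", "minutes": 120})
print(rec["country"], rec["_encrypted"])  # DE True
```

Because every pipeline goes through `transport`, updating `POLICY` in one place is enough to change the behavior everywhere, which is the property described above.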

That's excellent. So there's a lot of engineering behind it to make it all run properly. Arvind, can you double click into the technical aspects of yours?

Sure.
So the team came up with a really ingenious two-step process. On the one hand, we have thousands of product catalogs in our database, and the Google product taxonomy has roughly 5,500 classes and subclasses. So what the team did was first use Gecko to create the embeddings and narrow down the candidate categories, and then build a prompt from those candidates and pass it through text-bison to produce the final classification.
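A rough sketch of that two-step retrieve-then-generate pattern, with a toy bag-of-words embedding standing in for the Gecko embedding model and a stub in place of the text-bison call; the taxonomy entries here are illustrative, not the full ~5,500-class taxonomy:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" standing in for the Gecko model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

TAXONOMY = [
    "Apparel & Accessories > Shoes",
    "Apparel & Accessories > Clothing > Dresses",
    "Home & Garden > Kitchen & Dining > Cookware",
    "Electronics > Audio > Headphones",
]

def narrow_candidates(product_title, k=2):
    # Step 1: embedding similarity narrows thousands of classes to a short list.
    q = embed(product_title)
    scored = sorted(TAXONOMY, key=lambda c: cosine(q, embed(c)), reverse=True)
    return scored[:k]

def classify(product_title, llm=None):
    # Step 2: a text model (text-bison in the talk) picks the final class
    # from the narrowed candidates. Stubbed here: default to the top candidate.
    candidates = narrow_candidates(product_title)
    prompt = (
        f"Pick the best category for the product {product_title!r} "
        f"from: {candidates}"
    )
    return llm(prompt) if llm else candidates[0]

print(classify("wireless noise cancelling headphones"))
# Electronics > Audio > Headphones
```

The narrowing step keeps the final prompt small, which is the point of the two-stage design: the generative model only has to choose among a handful of plausible classes instead of the whole taxonomy.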

Very cool, very cool. Okay, so this is a little bit of a self-serving question, but why did you choose Google Cloud? And it can't be Donna and Kevin, so that's not on the table.

For us it was a proper process, five or six years ago, when we were first defining our cloud strategy, and we did a very thorough analysis. The three reasons, I would say, were your heritage in data; the innovation and roadmap you were proposing; and the approach and ways of working. It was very refreshing to see a relationship built on co-innovation and trying to tackle problems together, rather than "here is a price list, just consume these products and services." We have done that, and we have been very good partners, so far. I always have to add "so far."
So, Bluecore is natively built on GCP, so the team is very comfortable using the tools and technology, and Google has done a fairly good job building the gen AI technologies alongside the existing technologies. The other things you mentioned, data governance and security, are important to us, and a lot of those are also baked into the gen AI technology. So I think that's the first reason. The second is that we actually started this project on GPT-3.5 and got really good results, then moved on to GPT-4 and got really good results as well. And then when Google released PaLM 2 we decided to try it out, and so far the results have definitely been better than what we've seen with OpenAI's models.

Absolutely. Okay, so we've talked about use cases, and we've talked about technical solutions, but at the end of the day it's all about the business value, right? How do you actually see those business results? Donna, I want to start with you: how do we see those business results, and can you give me some examples? Because without them it's kind of pointless; you want to see those results.
Yeah. So let me start with AlphaFold. In the case of AlphaFold, customers were able to conduct experiments much faster, get much quicker insights, and minimize the high failure ratio of more traditional methods. The impact was really incredible to see, and it's one of the reasons I'm in this field: they were able to accelerate the drug discovery process for biotech and pharmaceutical companies alike. More generally, within the generative AI space we see customers getting benefits in employee productivity, allowing them to focus on more value-added tasks; more intuitive experiences being built with generative AI, and better user experiences; and also new insights or new ideas that were impossible before. But I think Arvind and Ignacio will have great input here.

Arvind? We had two
overarching goals for this specific project. The first was the accuracy of the data and the results, and the second was the cost of building and maintaining this feature. What we found was that using LLMs and gen AI we achieved both goals. To step back: we attempted to solve this problem last year using traditional AI techniques, and, like I mentioned, you're talking about thousands of product catalogs and around 5,000-plus classes in the Google taxonomy. Trying to build a model on that is extremely difficult, and scaling it across multiple verticals within the retail space makes it even more challenging. So for us it was extremely expensive, in human time, computational resources, and the amount of data we needed to train and build those models, and a lot of that was solved with gen AI.

Absolutely.
All right. I'll talk about three dimensions of benefits. First, the use case and the reason I'm here, which is summarization and understanding what customers are calling us about; then the three areas where we are working on generative AI, because that's one use case but we are really moving ahead in many areas; and finally, from a technology point of view, the benefits in velocity and cost of building, because that is what I'm responsible for.

So this particular use case, summarizing all the calls and why people are calling our call centers, is very simple: we were not able to do it before. It's a paradigm change; in the past it was impossible to do, simple as that.
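As a toy illustration of the kind of pipeline this enables, summarize each call into the reasons it mentions, then aggregate across calls; the reason labels are hypothetical, and the keyword lookup stands in for what an LLM would infer from a full, possibly translated, transcript:

```python
from collections import Counter

# Hypothetical keyword-to-reason map; a real deployment would let an LLM
# tag each transcript instead of matching keywords.
REASONS = {"bill": "billing", "invoice": "billing",
           "outage": "network issue", "signal": "network issue",
           "cancel": "churn risk"}

def tag_call(transcript):
    # Stand-in for an LLM summarization call: return the distinct reasons
    # a single call mentions.
    words = transcript.lower().split()
    return sorted({REASONS[w] for w in words if w in REASONS})

def top_reasons(transcripts, n=2):
    # Aggregate per-call tags into the most common reasons customers call.
    counts = Counter(r for t in transcripts for r in tag_call(t))
    return [reason for reason, _ in counts.most_common(n)]

calls = [
    "my bill is wrong and I want to cancel",
    "there was an outage and no signal all day",
    "question about my invoice",
]
print(top_reasons(calls)[0])  # billing
```

The aggregate view, what customers are actually saying at scale rather than what a survey sample suggests, is the paradigm change being described.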

And now we can do it, and that information has completely changed our understanding of, and our relationship and proximity with, our customers. We used to do surveys, get scores and some comments, and then a team would try to understand what that meant, crossing that data with the technical problems we had and making interpretations; you should have seen the debates we had in the customer boards we created to try to understand and get better. Now we have really granular information that is precise: it's what customers are actually saying, not what we think they're saying. That is a paradigm change. I cannot put money on that, but we are trying to reduce our detractors by 30 percent, and we are on track, so it's doing very well. Then, if I go to
the second dimension, which is the different areas: I would say we are working on chatbots and interaction with customers. It's early days, and we see an incredible opportunity there. We have chatbots today, but we have to invest a lot of money and time in training the different languages, hence my point about the fact that we operate multi-language: it's expensive, it's not precise, and the experience for the customer is not right. Our early tests are showing that this can again be a paradigm change compared with what we were doing before. So one big area, across the different domains of customer operations and all the different call centers we have, is chatbots. The second area is what we
call copilots: making sure people can really do their jobs better, and that touches a lot of areas. If you are on my team, the technical team: coding. We see productivity of ten to one compared with not using it. It's still experimental, and I'm not going to go into details, but also legacy: we grew through mergers and acquisitions, so we have a lot of legacy systems whose documentation doesn't exist and whose knowledge is disappearing. We are testing how to get those systems documented, and that again changes our ability to run operations differently, or to maintain systems that were not possible to maintain. So we are experimenting with what we call copilots, and you can add every different business area; we are running experiments on that. And then there's knowledge management. We're a massive company, we have a lot of information, and managing knowledge is a big problem for us; doing it well can completely change how our customers receive our services, so that is a third area.

So the first benefit was in the use case itself, the second is in these three areas, and finally, for me, something like Vertex AI and all the engineering is the velocity. The fact that we were able to build, in weeks, a real application that is working and handling all the calls for a full country is only because the tools are there; the cost and velocity of deployment are helping us really get the value.
Yeah, I really like this framing: what are you changing, how much does it cost, how effective is it, and can you measure it? Because otherwise it's just a boondoggle, you're spending money and who knows what's happening. But when you can prove it, you can double down on it and build on top of it. Absolutely. Kevin, can you talk a little bit about what you learned along the way of this magical gen AI journey?
In short, I learned that we have to continue learning; there's just so much to learn. Think about where we were six months ago, look at where we are today, and you can imagine six months from now. So it's a new normal, and the new normal is that you're really going to have to keep the learning up.

But then the good news is accessibility. Think about a year ago: how could you get your hands on LLMs? You had to spin up your own VMs, set it all up, download some framework and install it yourself before you could even prompt it, and probably troubleshoot a bunch of libraries along the way; we've gone through that. Today it's available everywhere: all the cloud vendors have it, we have Cohere, we have Anthropic. There's really no excuse not to get that hands-on experience, so go for it. You can sign up for free and try it out, and there are frameworks like LangChain to help you build these applications as well. So go hands-on.

The last thing is: we used to say "read the manual." In this world we may want to start saying "read the papers," as in research papers. You'll get a lot of insight into what's going on. A lot of this started with the Transformer paper back in 2017, and it's still a really good paper to read. There are good papers; if you don't have time, read the abstract. But I think that's where you can really get up to speed on what's coming down the pipe.

I used to go to the Google AI blog and just read all the papers they were publishing, because if Google's publishing it, it's probably state of the art. That's how I used to cheat-sheet my way through.

Yeah, exactly, the papers. Or you could run them through an LLM and get a summarization.
I think for a subset of problems, like the one I mentioned, LLMs provide a phenomenal foundation, because there is so much information encoded in these models; they're trained on so much relevant data that it makes the lift for us a lot easier. Like I mentioned, we tried this with traditional AI and it was really not sustainable for us to scale, and gen AI now provides the tools for us to not only solve but also scale problems such as the one I mentioned.
For me, three things. One: this is really transformational and everybody can see it, so don't try to centralize it in IT or in any one area. Let people experiment; make sure you create a safe environment for experimentation, and measure it so you can decide where to invest, but let people play, because people need to play; don't restrict it. On the other hand, invest in the foundation, because the only way to get real value is combining the LLMs with your own data. So invest in the foundation, invest in getting the right architecture, because then you can go super fast when an experiment shows that a use case is worth scaling. That is my learning. And sorry, one more: stick with your CEO. Make sure they understand the potential and the value, and that they are not just bombarded by vendors telling them it's a silver bullet that resolves every single problem. So invest in education, in bringing the whole company along, and do not centralize it, because then you kill it.

I've heard so many customers say "the board of directors, the CEO have told me we need to do this ASAP," so yeah, it's in every single boardroom, every single CEO office.
Okay, one last question and then we'll open it up to Q&A, so get ready; there are mics here in case you want to ask questions. Last question: any other nuggets of wisdom for the audience here in terms of business use cases?

Sure, yeah. I would say, and I think we've all touched on this already, experiment and iterate. Generative AI truly has the potential to transform every industry; we're living in a very exciting time. But nothing really substitutes for hands-on experience to understand what the value is, specifically for your business, and also to understand the limitations. So I would look within the organization, and what we've seen successful customers do is look at who's excited to pioneer this, which domain experts and which machine learning experts, and get them together to identify what success looks like and work on the use case. We did this as well: we worked with an SRE team on the postmortem search and summarization use case, with experts who really helped define what success looks like and whether our outputs are good. We started with a very small set of experts and gradually expanded to a trusted-tester group; we started with a small data set of 100 postmortems, expanded to a thousand, and now we're rolling this out more broadly. So that would be my advice: just get started.
So I consider gen AI, or LLMs, to be yet another tool in your toolbox. As you go about building features, I'm sure you have some success metrics, and if gen AI is going to help you reach those metrics and goals, then yes, it's the right tool. That's the first thing. And given, as you mentioned, Kevin, how rapidly things are changing, pick use cases that have a tolerance for getting it wrong: for example, use cases that are internal-facing, or that have a human in the middle, so you can rectify issues if they arise. The last one I would say is invest in tooling, technology, and platform. What I mean by that is: today we run 20-plus models, we have customers asking us questions about the performance and output of the models, and our teams have to debug and fix issues, so we've built a lot of tooling to be able to observe and scale the systems. I keep reminding my team: if you're going to build a ship, be prepared for a shipwreck. So while you're focused on LLMs and gen AI, ensure you have enough tooling and platform to support that.
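The kind of lightweight observability tooling described here, wrapping every model call to track volume, failures, and latency, might look something like this sketch; the model name and metrics shape are made up for illustration:

```python
import time
from collections import defaultdict

# Toy observability layer: per-model call counts, errors, and latency.
METRICS = defaultdict(lambda: {"calls": 0, "errors": 0, "total_ms": 0.0})

def observed(model_name):
    # Decorator that records metrics for every call to a model wrapper.
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            m = METRICS[model_name]
            m["calls"] += 1
            try:
                return fn(*args, **kwargs)
            except Exception:
                m["errors"] += 1
                raise
            finally:
                m["total_ms"] += (time.perf_counter() - start) * 1000
        return inner
    return wrap

@observed("toy-classifier")
def classify(text):
    # Stand-in for a hosted model call.
    return "positive" if "good" in text else "negative"

classify("really good results")
classify("mixed feelings")
print(METRICS["toy-classifier"]["calls"])  # 2
```

With every model routed through a wrapper like this, the team can answer customer questions about performance and spot failing models without instrumenting each feature separately.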

Sorry, nothing else to add; I think I already covered everything, and it's precisely that.

I think my piece of advice is: if anyone tells you they know what gen AI is going to be doing in five years, they're lying to you. So stay humble and stay attached to the actual short-term use cases, because I'm sure it's going to be crazy in five years, but, to your point, we're all just like, "wait, what did we announce?" Absolutely. All right, we are out of time. We'd love it if you give us feedback here, and I think we'll be around if you have any other questions as well. Thank you, everyone.
