Unleashing the Power of Gen AI and XR at Pfizer

AWE
26 Jun 2024 · 30:30

Summary

TL;DR: The video details Pfizer's journey in integrating XR and AI technologies to revolutionize manufacturing. George Hanaras introduces the Smart Factory team's efforts to enhance efficiency and reduce human error by leveraging AR and AI. Nicholas Hawley discusses the development of XR training to virtualize the manufacturing shop floor, improving accessibility and retention. San Sharma highlights the technical stack and AI's role in content creation and spatial awareness. The team shares insights on strategic collaborations, user adoption, and the importance of data standardization for effective AI integration.

Takeaways

  • George Hanaras from Pfizer's Smart Factory team discussed leveraging XR and AI technologies to improve manufacturing efficiency and reduce human error.
  • The team identified pain points in manufacturing, such as complex procedures and environments, the need for guidance for operators, and issues with siloed systems.
  • They explored AR as a solution, experimenting with various hardware and software options to create immersive experiences that could be deployed in a regulated environment.
  • The decision was made to opt for an in-house development model, utilizing the Unity 3D engine and forming agile teams to develop and deploy AR solutions.
  • Emphasis was placed on integrating systems and data to bring the right information to the right user at the right time, standardizing and democratizing data for immersive experiences.
  • The Smart Factory team recognized the need to digitize tacit knowledge and provide end-to-end visibility of processes and tasks for operators.
  • The team has won awards for its work and is now exploring the integration of AI with AR, using predictive models and large language models to enhance manufacturing operations.
  • Nicholas Hawley highlighted the use of XR for training, emphasizing the importance of extending the reality of manufacturing assets to people for better training outcomes.
  • The Pfizer Verse platform was introduced as a centralized hub for training content, enabling multi-user experiences and real-time interaction with trainers in virtual environments.
  • The journey of implementing XR in training involved starting with research, building proofs of concept, and eventually bringing development in-house for cost efficiency and agility.
  • Looking forward, the team is exploring AI in XR for asset optimization, generative voice and avatars, and the potential for AI to improve the user experience in manufacturing.

Q & A

  • Who is George Hanaras and what is his role in the Smart Factory team?

    -George Hanaras is a member of the Smart Factory team at Pfizer's Digital Manufacturing organization. He is involved in leveraging innovative technologies like XR (Extended Reality) and AI to support manufacturing plants and improve front-line efficiency.

  • What is the primary goal of the Smart Factory team in terms of technology implementation?

    -The primary goal of the Smart Factory team is to utilize innovative technologies such as XR and AI to support manufacturing plants, reduce human error, and enhance front-line productivity.

  • What challenges did Pfizer's manufacturing plants face that led them to explore AR technology?

    -The challenges included complex procedures and environments, the need for operators to navigate multiple siloed systems for information, unpredicted issues without clear root causes or corrective actions, and a significant amount of tacit knowledge not being digitized.

  • How did the Smart Factory team approach the integration of AR technology in their operations?

    -They started by researching the market, identifying AR as a potential solution, and experimenting with various hardware and software components. They focused on high-value use cases and received positive feedback from early adopters, leading to the establishment of an in-house development model.

  • What is the significance of Unity 3D engine in the Smart Factory team's development process?

    -The Unity 3D engine is at the center of their internal delivery pipeline, allowing the team to quickly develop prototypes and deploy AR solutions in a compliant manner.

  • How does the Smart Factory team ensure that the AR solutions are aligned with business requirements?

    -They have formed internal agile teams consisting of 3D designers, artists, and developers who can convert business requirements into end-to-end AR experiences and deploy them compliantly.

  • What is the role of AI in the Smart Factory team's current initiatives?

    -AI is used to train predictive models to find correlations in historical manufacturing data, allowing for instant notifications of deviations during new production runs. Additionally, AI is used to generate targeted action points and convert textual output into AR experiences.
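
    The predictive step described in this answer can be sketched in outline. This is a hedged illustration, not Pfizer's actual system: the per-timestep baseline, the z-score comparison, the threshold, and all function and variable names are assumptions standing in for whatever models the team trained on historical batch data.

    ```python
    # Illustrative sketch: compare a live production run against patterns
    # learned from historical runs and flag deviations for notification.
    # The z-score method and all names here are assumptions, not Pfizer's design.
    from statistics import mean, stdev

    def build_baseline(historical_runs):
        """Per-timestep (mean, stdev) computed across historical batch readings."""
        baseline = []
        for readings in zip(*historical_runs):
            baseline.append((mean(readings), stdev(readings)))
        return baseline

    def check_run(live_run, baseline, z_threshold=3.0):
        """Return the timestep indices where the live run deviates from history."""
        deviations = []
        for i, (value, (mu, sigma)) in enumerate(zip(live_run, baseline)):
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                deviations.append(i)  # would trigger an instant operator notification
        return deviations

    historical = [[70.1, 71.0, 72.2], [69.8, 70.9, 72.0], [70.3, 71.2, 72.4]]
    baseline = build_baseline(historical)
    print(check_run([70.0, 71.1, 75.9], baseline))  # deviation at index 2
    ```

    A production system would use richer multivariate models, but the shape is the same: learn the historical pattern once, then score each new campaign against it in real time.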

  • What is the purpose of the 'connected worker interfaces' developed by the Smart Factory team?

    -The connected worker interfaces aim to bring multiple systems together in a single pane of glass, enabling end-to-end visibility of processes and the visibility of the next critical task for the user, enhancing efficiency and reducing the need for manual data navigation.
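
    As a rough illustration of the "single pane of glass" idea, the sketch below merges task feeds from multiple source systems and surfaces the next critical task for a persona. The system names, record fields, and priority rule are invented for this example; the talk does not describe Pfizer's implementation.

    ```python
    # Hypothetical sketch of a connected-worker aggregation step: merge siloed
    # task feeds and pick the most urgent task for a given persona.
    def next_critical_task(systems, persona):
        """Merge task feeds from several systems; return the most urgent task."""
        merged = [t for feed in systems.values() for t in feed
                  if t["persona"] == persona]
        # lower due_in_minutes = more urgent; ties broken by higher severity
        merged.sort(key=lambda t: (t["due_in_minutes"], -t["severity"]))
        return merged[0] if merged else None

    systems = {
        "mes":  [{"persona": "operator", "task": "record batch weight",
                  "due_in_minutes": 30, "severity": 1}],
        "cmms": [{"persona": "operator", "task": "clear filler jam",
                  "due_in_minutes": 5, "severity": 3}],
    }
    print(next_critical_task(systems, "operator")["task"])  # clear filler jam
    ```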

  • How does the Smart Factory team address the issue of wearables not being mature enough for full shift adoption?

    -They decided to use mobile devices — tablets specifically — mounted on trolleys or movable carts, allowing operators to perform complex tasks hands-free when needed while still benefiting from the AR element.

  • What is Pfizer's vision for the convergence of XR and AI in the future of manufacturing?

    -Pfizer envisions XR as the interface for AI on the shop floor, collecting data from various software processes into a single data channel, which then feeds into an AI system to make data smarter and more adaptive, aiding operators in making better decisions.

  • What are some of the technical considerations for deploying AR and VR applications in manufacturing as discussed by San Sharma?

    -The technical considerations include content creation from CAD drawings and 360° scans, spatial awareness setup using image tracking or plane detection, development platforms like Unity and Xcode, SDKs and APIs like ARKit and the XR Interaction Toolkit, and networking libraries for multiplayer experiences. Additionally, there is a focus on data integration from IoT sensors and the use of XR content management systems for asset distribution and access.
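
    The data-integration step this answer mentions can be pictured as joining an IoT sensor reading with a spatial anchor so an XR client knows where to render it. This is an illustrative sketch only, not Pfizer's stack: the anchor registry, payload fields, and sensor names are all hypothetical.

    ```python
    # Hypothetical sketch: turn an IoT sensor reading into a payload an XR
    # client could render at a known spatial anchor (e.g. a cloud anchor).
    import json

    def to_xr_payload(reading, anchors):
        """Join a sensor reading with its spatial anchor for XR placement."""
        anchor = anchors[reading["sensor_id"]]  # lookup in an anchor registry
        return json.dumps({
            "anchor_id": anchor,
            "label": f'{reading["sensor_id"]}: {reading["value"]} {reading["unit"]}',
            "alert": reading["value"] > reading["high_limit"],
        })

    anchors = {"temp-probe-7": "anchor-a41"}
    reading = {"sensor_id": "temp-probe-7", "value": 81.5,
               "unit": "C", "high_limit": 80.0}
    print(to_xr_payload(reading, anchors))
    ```

    The point of the join is that the XR app never needs to know about the source system; it only consumes labeled, anchored payloads.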

Outlines

00:00

Introduction to Smart Factory and AR Journey

George Hanaras introduces the Smart Factory team from Pfizer's manufacturing organization, emphasizing their goal to integrate innovative technologies like XR and AI to enhance manufacturing plant efficiency and reduce human error. The team's mission is to address pain points identified by customers, especially during the COVID-19 pandemic, when scaling up production rapidly became necessary. They explored AR as a solution to complex procedures, siloed systems, and tacit knowledge that had not been digitized. After positive initial feedback, they opted for an in-house development model using the Unity 3D engine for rapid prototyping and deployment, focusing on high-value use cases and collaborating with third parties when beneficial.

05:01

AR Implementation and the Evolution of Wearables

The team at Pfizer faced challenges with wearables' maturity for full-shift use, leading to a shift towards mobile devices and tablets. They developed connected worker interfaces to integrate multiple systems for end-to-end visibility of processes. The adoption of AI and AR together marked a new phase, where historical manufacturing data was used to train predictive models, enabling real-time issue notifications during production. The integration of AI with AR experiences was highlighted, focusing on adding value without creating noise for operators, particularly for complex tasks requiring guidance.

10:01

๐Ÿ› ๏ธ XR Training and the Manufacturing Shop Floor Virtualization

Nicholas Hawley, a product manager for XR training at Pfizer, discusses the importance of extending the reality of Pfizer's most valuable assets: its people and the manufacturing shop floor. The team addressed issues in traditional training, such as limited access to manufacturing areas and lengthy, hard-to-retain SOPs. They implemented a four-pronged approach involving virtual twins, VR training, 360 video training, and the Pfizer Verse platform, which allows multi-user training experiences. The journey began with research and proofs of concept, leading to the creation of a self-service model and, eventually, a centralized platform for easy access to training content.

15:01

๐ŸŒ Technical Stack and Future of XR Application Development

San Sharma, the tech lead for the Smart Factory team, provides an overview of the technical stack used for building XR applications, including content creation tools, spatial awareness technologies, development platforms like Unity, and SDKs for AR and VR. He discusses the importance of data integration from IoT sensors and the use of networking libraries for collaborative experiences. Sharma also highlights the potential of AI in content creation, spatial awareness, asset retrieval, and conversational interfaces, envisioning a future where XR serves as the interface for AI on the shop floor, with data collection and AI systems supporting better decision-making.

20:02

๐Ÿ† Award Achievement and Generative AI Integration

The discussion includes the team's recent win at the Auggie Awards for the best use of AI, attributed to their effective use of predictive and generative models based on historical manufacturing data. The team explores the use of large language models within a highly regulated space, emphasizing the need for secure versions of these models. They detail how these models are used to query data and create an easier interface for visualization, highlighting the synergy between AI and AR in their applications.

25:03

Pivots and Learnings in XR Deployment and Development

The team reflects on their journey since 2018, discussing the challenges and lessons learned in deploying XR systems. Key takeaways include the importance of aligning with vendor roadmaps, the need for strategic collaborations, and the challenges of user adoption, particularly with AR on the shop floor. They share insights on the transition from using headsets to iPads for comfort and practicality, and the importance of flexibility in development approaches to accommodate changing requirements and feedback.

30:03

๐Ÿ‘ Closing Remarks and Openness to Collaboration

In the final paragraph, the team concludes the session with an invitation for collaboration and feedback. They express gratitude for the audience's attention and participation, encourage questions, and highlight their openness to suggestions and partnership opportunities. The session wraps up with a round of applause for the presenters and a reminder that this was the last session for the day.

Keywords

Smart Factory

Smart Factory refers to the integration of advanced technologies within a manufacturing plant to increase efficiency, reduce human error, and improve overall productivity. In the video, George Hanaras mentions being part of the Smart Factory team at Pfizer's Digital Manufacturing organization, which aims to leverage technologies like XR (Extended Reality) and AI to support manufacturing processes.

XR (Extended Reality)

XR, or Extended Reality, encompasses all real-world and virtual environments that are created by computer technology, including AR (Augmented Reality) and VR (Virtual Reality). In the script, the team discusses using XR to provide an immersive experience that guides operators through complex tasks, enhancing their ability to perform efficiently.

AI (Artificial Intelligence)

AI, or Artificial Intelligence, is the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. The video discusses the use of AI in predictive models to analyze historical manufacturing data and identify deviations from expected patterns, as well as in generative models to provide targeted action points for users.

Manufacturing Efficiency

Manufacturing efficiency refers to the optimization of the manufacturing process to reduce waste and increase productivity. The script highlights the goal of leveraging innovative technology to make the front line more efficient by reducing human error and improving overall productivity.

Human Error

Human error refers to mistakes made by people during the manufacturing process that can lead to inefficiencies or defects in products. The video script discusses the use of technology to reduce human error by providing additional layers of guidance to operators performing tasks.

Predictive Models

Predictive models are statistical algorithms that make predictions about future events or trends based on historical data. In the context of the video, predictive AI models are trained to find correlations within historical manufacturing data, allowing the team to anticipate and address deviations during new production runs.

IoT (Internet of Things)

IoT, or the Internet of Things, refers to the network of physical devices, vehicles, and other items embedded with sensors that enable them to connect and exchange data. The script mentions using data from IoT sensors on the shop floor and integrating it into XR applications to enhance the manufacturing process.

3D Modeling

3D modeling is the process of creating a three-dimensional representation of an object or environment using specialized software. In the video, 3D modeling is used to create assets and animations for AR applications, which are essential for providing realistic and immersive experiences to users.

Agile Teams

Agile teams are self-organizing, cross-functional groups that work in short iterations, known as sprints, to produce products in a flexible and efficient manner. The script mentions the formation of internal agile teams consisting of 3D designers, artists, and developers to convert business requirements into end-to-end AR experiences.

AR (Augmented Reality)

AR, or Augmented Reality, is a technology that overlays digital information or images onto the real world, enhancing the user's perception of their environment. The video discusses the use of AR to provide guidance and instructions to operators on the shop floor, helping them perform tasks more effectively.

VR (Virtual Reality)

VR, or Virtual Reality, is a computer-generated simulation of a three-dimensional environment that users can interact with. The script describes the creation of immersive VR training spaces for shop floor workers, allowing them to interact with virtual objects and learn procedures in a simulated environment.

Generative AI

Generative AI refers to artificial intelligence systems that can create new content, such as text, images, or audio, based on learned patterns. In the video, generative AI is used to create targeted action points by scanning through a large knowledge base of work instructions and user logs, and to convert textual output into AR experiences.

Highlights

George Hanaras introduces the Smart Factory team from Pfizer's manufacturing organization, focusing on leveraging XR and AI to improve manufacturing efficiency.

The team addresses pain points from manufacturing plants, especially during the COVID-19 pandemic, where complex procedures and environments were challenging for operators.

Smart Factory explored AR as a solution for the identified issues, adopting a startup mentality to quickly test new technologies.

Hardware and software trials for AR included wearables, mobile devices, tablets, headsets, QR codes, image targets, 3D scans, and cloud anchors.

In-house development was chosen for AR solutions due to complex customer requirements and a highly regulated space.

Unity 3D engine is central to the internal delivery pipeline for developing and deploying AR solutions.

Data standardization and democratization are key to bringing the right information to users in AR experiences.

The Connected Worker interface aims to integrate multiple systems for end-to-end visibility of processes and tasks.

Smart glasses were deemed immature for full shift use, leading to a preference for mobile devices and tablets on movable carts.

AI and AR integration is the current phase, using predictive models to identify deviations from historical patterns in manufacturing.

Large language models (LLMs) are deployed to scan through SOPs and work instructions, generating targeted action points for users.
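
    Before any LLM can generate targeted action points, relevant SOP or work-instruction passages have to be pulled from the knowledge base. The sketch below illustrates that retrieval step with simple keyword overlap; it is a hedged illustration, not Pfizer's pipeline, and a real system would use embeddings and a secured model, as the talk's regulated-space discussion implies. The SOP snippets are invented.

    ```python
    # Illustrative retrieval step: rank knowledge-base chunks against an
    # operator's issue before assembling an LLM prompt. Keyword overlap is a
    # stand-in for embedding similarity; all SOP text here is hypothetical.
    def _tokens(text):
        return set(text.lower().replace(",", " ").replace(".", " ").split())

    def retrieve(chunks, query, k=2):
        """Return the k chunks sharing the most tokens with the query."""
        q = _tokens(query)
        scored = sorted(chunks, key=lambda c: -len(q & _tokens(c)))
        return scored[:k]

    sops = [
        "To clear a filler jam, stop the line and lock out the filler head.",
        "Batch records must be signed by a second operator.",
        "If the capper torque deviates, recalibrate per the work instruction.",
    ]
    context = retrieve(sops, "filler jam on line 3")
    print(context[0])  # the filler-jam SOP ranks first
    ```

    The retrieved chunks would then be passed as context to the model, which drafts the targeted action points the highlight describes.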

Textual output from AI is being converted into AR experiences for seamless transition from 2D to immersive guidance.

Nick Hawley discusses the XR training journey at Pfizer, emphasizing the importance of extending reality for valuable assets like people.

XR training aims to solve issues like limited access to manufacturing areas and lengthy, hard-to-retain SOPs.

Virtual twins, 3D scanning, and custom VR training are part of the multi-pronged approach to XR training at Pfizer.

Pfizer Verse is a centralized platform for hosting and accessing all XR training content, enabling multi-user experiences.

San Sharma, the tech lead, outlines the technical stack used for building XR applications, including content creation and development platforms.

AI is seen as crucial for improving content creation, spatial awareness, asset retrieval, and conversational interfaces in XR applications.

Pfizer envisions XR as the interface for AI on the shop floor, with a focus on data collection, AI integration, and operator guidance.

The team discusses lessons learned, such as the importance of strategic collaborations and user adoption, as well as the shift from AR headsets to iPads for comfort.

The decision to bring development in-house was driven by the need for agility, cost efficiency, and the ability to quickly iterate based on user feedback.

Transcripts

play00:03

hello everyone um let's just wait for a

play00:06

few people to sit I see they're coming

play00:13

now so hello everyone I'm really happy

play00:16

to be here I hope uh everyone enjoyed

play00:19

the show so far and I want to thank you

play00:21

for staying until the end to share a

play00:24

speak um my name is George hanaras and

play00:28

uh I'm part of uh d team called smart

play00:31

Factory led by Ron Kelly H we're part of

play00:34

fizer's diesel manufacturing

play00:36

organization and our overall goal is to

play00:39

leverage Innovative technology like XR

play00:41

and AI to support our manufacturing

play00:44

plants and to enable our front line to

play00:46

be more efficient by reducing human

play00:49

error and improving their over

play00:52

productivity uh for today's uh

play00:55

presentation we want to put ourselves in

play00:57

the place of the audience and think of

play00:59

what would be valuable to share when

play01:01

we're trying to implement Innovative

play01:03

technology in manufacturing what are

play01:05

some of the roadblocks what are some of

play01:08

the best practices we can follow to be

play01:12

successful uh so with that I will start

play01:14

with uh our air Journey so far and uh

play01:19

I'll take it from the beginning uh we

play01:21

had our customers our manufacturing

play01:22

plants coming to us with some big pain

play01:24

points that they had

play01:26

identified especially during Co when we

play01:29

were trying to scale up at really high

play01:31

Pace uh to meet our Global uh market

play01:33

demand we're seeing that our comp that

play01:36

our procedures and also our environments

play01:39

were becoming more and more complex and

play01:41

our operators needed that extra layer of

play01:43

guidance to perform their tasks we're

play01:46

also seeing that uh usually they needed

play01:48

to navigate between multiple silot

play01:50

systems to get the information they

play01:52

needed in order to make uh their

play01:54

decisions in addition sometimes there

play01:57

were issues that we could not predict

play01:59

and when those issues did occur we

play02:01

didn't have a clear root cause or a

play02:03

clear corrective course of

play02:06

action finally we saw there was lot a

play02:08

lot of tacd knowledge that was not

play02:10

digitized and was living with our more

play02:12

experienced

play02:14

smmes so what we did as smart Factory we

play02:18

uh researched the market and we

play02:20

identified augment reality as a

play02:22

potential technology that can solve this

play02:24

issues we always have this startup

play02:26

mentality want to quickly try out the

play02:28

latest and greatest Tech and see if we

play02:30

can get value out of them so we

play02:32

approached they from both hardware and

play02:35

software on the hardware side we triled

play02:38

out um a lot of devices wearables uh

play02:41

mobile uh devices tablets and headsets

play02:44

and on the software side uh we tried out

play02:47

the different components that can enable

play02:48

this High experiences like uh QR code

play02:51

image targets uh 3D scans and Cloud

play02:54

anchors we initially focused on the more

play02:57

uh high value use cases and uh we're

play03:00

happy to see that the initial feedback

play03:02

from our uh early adopters was really

play03:05

positive and uh the demand was adding

play03:09

up so in order to meet uh the complex uh

play03:14

requirements of our customers and uh

play03:17

we're in a space where it's uh really uh

play03:20

highly regulated we have some additional

play03:23

restrictions what made sense for us is

play03:25

to opt for an in-house uh development

play03:27

model uh so we set up our own internal

play03:31

uh delivery pipeline uh at the focus of

play03:33

it we have uh the unity 3D engine that

play03:36

allows us to quickly develop prototype

play03:38

and deploy Air

play03:40

Solutions we formed our own internal uh

play03:43

agile teams consisting of uh 3D

play03:46

designers artists and developers that

play03:48

were able to take the business

play03:50

requirements and convert them to end to

play03:53

end air experiences and then deploy them

play03:55

in a compliant

play03:57

way uh we don't want want to reinvent

play04:01

the wheel whenever it makes sense we

play04:02

also partner up with third parties and

play04:04

we're trying to also use existing

play04:06

solutions to fill the gaps in our

play04:10

processes our next Milestone uh was

play04:13

about uh systems and data so a basic

play04:17

aspect of augment reality is the ability

play04:19

to bring uh the right information to the

play04:21

right user to the right location so what

play04:24

we did there is work closely with our uh

play04:27

supporting uh teams within digital

play04:29

Manufacturing

play04:30

and uh we had a goal to uh standardize

play04:34

democratize and catalog all of our data

play04:36

and then add this speciaal uh context to

play04:39

them so as a result we can bring them

play04:41

into our immersive experiences we also

play04:44

put a lot of focus in our um user

play04:47

Journey research and our Persona mapping

play04:49

so we came up with these connected

play04:51

worker interfaces the main concept of it

play04:53

is to be able to bring multiple uh

play04:56

systems together in a single pane of

play04:58

glass and enable that end to end

play05:00

visibility of processes and also the

play05:02

visibility of the next critical task for

play05:05

its

play05:07

Persona we also came to realize that the

play05:10

wearables the smart glasses were not

play05:12

that mature yet to be fully adopted and

play05:15

um they could not be used for a full

play05:17

shift so that's why we decided to go

play05:19

with mobile devices with tablets

play05:21

specifically and as you can see we opted

play05:24

for this setups that we can have maybe a

play05:26

trolley or a movable cart on the floor

play05:30

um the iPad is set up on top of it and

play05:32

we can have we can still have this Air

play05:34

element uh but our operators can also

play05:37

perform the more complex tasks with

play05:39

their handsfree when

play05:41

needed so this takes us uh to today uh

play05:45

after a lot of trying and a couple of O

play05:47

Awards later we are in a phase where we

play05:49

are using Ai and AR together so the

play05:53

first step was to have our data in the

play05:55

right format and then we were able to

play05:58

train our pred predictive models our

play06:00

predictive AI models and find

play06:02

correlation between all of our

play06:04

historical manufacturing data so what we

play06:07

are doing now is H during a new

play06:09

production during a new campaign or a BS

play06:12

we can compare to those historical uh

play06:14

patterns and if we see that we're

play06:16

deviating from them we can instantly

play06:18

notify our users of an

play06:20

issue then we're bringing gen into the

play06:23

mix so we're deploying LM models that

play06:26

can quickly scan through our large know

play06:29

base of uh Sops work instructions uh

play06:33

user logs and generate targeted action

play06:36

points for our users we are then taking

play06:39

it to a step further we're converting

play06:41

this textual output uh to an AR

play06:43

experience so we're enabling this

play06:45

simless transition from the 2D interface

play06:47

to the more immersive location based uh

play06:51

experience something to note here is we

play06:54

don't want to add AR everywhere but only

play06:56

when it makes sense um so uh we don't

play06:59

add extra noise to The Operators only

play07:01

when there's those more complex tasks

play07:03

that require this extra level of

play07:05

guidance then we jump into their

play07:09

layer

play07:10

um we're still in baby steps here on

play07:14

this field uh as we believe the whole

play07:17

industry but we're already excited to

play07:19

see some value coming out of it uh we

play07:21

were really happy to share also during

play07:23

the keynote and throughout the week that

play07:26

uh there are many that believe that AR

play07:29

will be the the interface to Ai and uh

play07:32

this for sure uh needs the hardware to

play07:34

evolve in the same way so we're hoping

play07:36

to see maybe wearables in the future

play07:38

enabl this uh in a better way but uh for

play07:41

us uh it will still be a big part of our

play07:44

future road map and we're excited for it

play07:47

so I would like to pass it over now to

play07:48

Nick to talk about our XR training space

play07:51

thank you

play07:56

everyone thanks George hey everybody I

play07:59

am uh Nicholas Hawley and I'm a product

play08:01

manager for our uh XR training at fizer

play08:05

U just want to say thank you to everyone

play08:07

for being here sticking it out to the

play08:08

end you know what they say they say you

play08:10

save the best for last or maybe in our

play08:13

case the last one is the request for a

play08:15

speaker spot but um here we are um but

play08:19

uh yeah so XR training um you know

play08:22

before I get into some of the specifics

play08:24

around it I'd like to say you know

play08:26

around the mission and um you know think

play08:29

the one thing that XR opens up for for

play08:32

fizer and um uh for manufacturing as a

play08:36

whole is really extending the reality

play08:39

like literally extending the reality of

play08:41

our most valuable assets and those are

play08:43

people and that's our manufacturing shop

play08:45

floor our manufacturing shop floor is

play08:48

our lifeblood of our operations um and

play08:51

if we don't have that then we have no

play08:53

fiser and if we don't have people we

play08:55

have no fiser so what we've been doing

play08:57

in the extended reality training space

play08:59

is is actually bringing and and

play09:02

virtualizing our manufacturing shop

play09:04

floor and bringing it to the people that

play09:06

need the information to perform their

play09:07

jobs and uh their processes on the line

play09:11

um so some of the problems that we saw

play09:13

around the training space before XR was

play09:16

uh limited access to our manufacturing

play09:18

areas um the the uh requirement to have

play09:22

multiple repetitions on the physical

play09:24

equipment that's very hard to access on

play09:26

the training lines um lengthy Sops 100

play09:29

plus pages long um that really uh didn't

play09:32

have a lot of retention when people were

play09:34

going through it um and human error that

play09:36

can cause production

play09:38

delays um so some of the ways that we

play09:41

actually approach this to uh to solve

play09:43

this um it's kind of a four-pronged

play09:46

approach um is our virtual twins um

play09:49

which you can see in the video right

play09:50

here where um we actually started 3D

play09:52

scanning all of our manufacturing

play09:54

environments and really that's our

play09:56

backbone to uh um to all of our uh VR

play10:00

training that we build as well um so

play10:03

this is for as George mentioned our

play10:05

anchoring for our AR and also we use to

play10:07

actually develop our custom 3D assets on

play10:09

the manufacturing line as well, and this enables our custom VR training, which you can see here: we are completely virtualizing the shop floor and creating immersive, interactive VR training spaces for our shop-floor workers. They are able to interact with objects within the space and go step by step through the trainings they need to carry out.

Another area we are expanding into is 360 video training. Depending on the use case and the scenario, it might make more sense to leverage 360 video, like you can see here, instead of a fully immersive, interactive training experience, and there is a cost differentiator there as well.

And lastly, the backbone behind all of that is our Pfizer Verse platform, which hosts all of this content and enables you to create multi-user experiences, where a GMP-qualified trainer can join the same space as a trainee, have a real-time dialogue over the virtual equipment, and even quiz them and ask them questions, so they can understand whether the trainee knows the procedure before they ever step foot on the line.

So there is really a multi-pronged approach to extending the reality here. If we can take our assets, the production floor, and extend them to the people who need them at the time they need them, we are unlocking a lot of value that way. But to get there, and to get even

play11:51

just the clips you saw, it was a journey, much like the AR journey.

In 2018 we started with a lot of research. We were looking across the market, understanding the immersive technology that was available, and identifying use cases to understand how XR, VR, and so on were going to make an impact on what we were doing within digital manufacturing.

In 2019 we started to partner with a few vendors and built some proofs of concept in VR training and in the 3D scanning space, really just to understand things conceptually and have something we could market internally, to get in front of the right folks so they could see how this could be applied in manufacturing.

2020 was obviously the year COVID hit, so there was a lot of demand to move to more virtual learning formats. That year we were able to show some of the proofs of concept we had built with the 3D scanning technology to the right people in the business at Pfizer, and they wanted to adopt the early days of the virtual twin technology you saw. Now we are at 16 sites globally with over 2,000 scans. We created a self-service model where we could send the scanners to the sites, and the sites could replicate their areas and show the people who worked on the production floor, and people externally, what was happening inside those areas. This was really the first moment where we enabled desktop XR, I guess you could say; more people were experiencing this extended reality of a production floor on their desktop.

And in 2021, much like the AR journey, this is where we started to bring our development in-house, because one of the things we realized in 2019 with the vendor builds was that there was a lot of ongoing cost whenever trainings changed. There was a lot of cost efficiency in building them internally and managing the code base ourselves going forward. This is where we started to build some of the custom VR trainings you are seeing, really reducing the cost of development and also seeing some of that value around reduced training time and reduced human error for the trainings we were

play14:16

building.

2022 was really a focus on centralization. We had these different virtualized manufacturing environments with training in them, we had code that we owned and could iterate on to build more and more trainings, and we were building an asset base to enable those trainings, but we did not have a centralized way for people to start accessing them, and we did not have a way to enable that multi-user experience. So this is when we created Pfizer Verse. Pfizer Verse is essentially a menu, like a Netflix channel, where you can go in and see all the trainings we have built for a certain area; all the different areas are there, and you can join as a free roam or go into the training experience with your trainer, to go through the training together or individually. So that was taking all the data we had collected so far from the 3D enablement and the in-house design and centralizing it in an easy-to-use, centralized location.

And lastly, that leads us to

play15:20

2023 and the present, where we are starting to actively explore XR and AI. From the VR perspective, we see asset optimization being a really big player in the XR-and-AI space, in terms of optimizing our 3D asset pipeline to generate textures, objects, and things like that, to reduce the amount of time it takes to actually build the environments and the objects we will be using in these spaces: feeding a generative AI reference images so it can create the objects we need. There is generative voice, so actually having the ability to speak with a model that is trained to be an expert on the trainings in the space when people go in, to reduce the need for trainer interactions there. And then there is also the generative avatar, the fully embodied AI avatar that can take you through a training and that you can have a real-time dialogue and conversation with. These are the areas that, from an XR perspective, we are really excited to explore in the future with XR and AI. Thank you.
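The quiz-style interaction described above, a trainer (human or generative avatar) checking whether a trainee knows a procedure, can be sketched as retrieval over procedure text feeding a question step. Everything below is a hypothetical illustration, not Pfizer's implementation: the SOP steps are invented, and a simple keyword-overlap retriever stands in for the privately hosted language model a real system would call.

```python
import re

# Hypothetical SOP steps a training assistant might be grounded on.
SOP_STEPS = {
    1: "Verify the line clearance checklist is signed before starting.",
    2: "Sanitize gloves before opening the filling machine guard.",
    3: "Record the batch number in the electronic log.",
}

def _words(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve_step(question: str) -> str:
    """Return the SOP step sharing the most words with the question.
    Stands in for an LLM retrieval call; keeps the sketch self-contained."""
    return max(SOP_STEPS.values(),
               key=lambda step: len(_words(question) & _words(step)))

def quiz_prompt(step_id: int) -> str:
    """Build the kind of question a generative trainer avatar might ask."""
    return (f"Before stepping on the line: what does step {step_id} "
            f"require? ({SOP_STEPS[step_id]})")

print(retrieve_step("What do I do with the batch number?"))
```

A production version would replace `retrieve_step` with a call to the secured model instance and score the trainee's spoken answer against the step text.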

play16:28

Thank you. I will be passing it off to San.

Thanks, Nick. Good afternoon, everyone, I'm the cool techie guy here. Hello everyone, my name is San Sharma and I am the tech lead for the Smart Factory team, and personally I am very excited to talk about the interesting work we are doing at Pfizer in terms of XR and AI.

First, I would like to give a quick overview of the current technical stack we use for building XR applications at Pfizer. Basically, we start with content creation, where we take the CAD drawings and the 360° scans from Matterport, feed them into a 3D modeling tool such as Maya, Blender, or 3ds Max, and create the assets and animations. Then, if you are building an AR application, you probably need to do a spatial-awareness setup, so depending on the environment and the use case we use image tracking, plane detection, cloud anchors, or area targets. In terms of development platform we use Unity, because it is a versatile and easy-to-use tool, and we also use Xcode for building the native UIs of our Apple applications. In terms of SDKs and APIs, it depends on the requirements: if we are building an AR application we use ARKit, ARCore, and Vuforia, and if we are building VR applications we use the XR Interaction Toolkit. And as Nick mentioned, we are building collaborative experiences, so we end up using networking libraries such as Photon for the multiplayer functionality, and we also use cloud streaming for rendering 3D assets at runtime. Then comes the data part, which is the key part: we take the data on the shop floor from the IoT sensors and integrate it into our XR applications using microservices, APIs, and connectors.
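That data layer, a microservice surfacing IoT sensor readings so an XR client can overlay them on equipment, could look roughly like the sketch below. The tag names and payload shape are invented for illustration; they are not Pfizer's API.

```python
import json

# Stand-in for a live IoT historian keyed by equipment/measurement tags.
SENSOR_READINGS = {
    "mixer-01/temp_c": 37.2,
    "mixer-01/rpm": 120,
}

def get_overlay_payload(equipment_id: str) -> str:
    """Return a JSON payload an AR overlay could render next to the asset."""
    prefix = equipment_id + "/"
    values = {tag.split("/", 1)[1]: v
              for tag, v in SENSOR_READINGS.items() if tag.startswith(prefix)}
    return json.dumps({"equipment": equipment_id, "readings": values})

print(get_overlay_payload("mixer-01"))
```

In practice this handler would sit behind an HTTP endpoint and read from the plant's historian through the connectors mentioned above.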

play18:29

Once that is all done, we package the application and test it using TestFlight, and once the validation team has done the testing, it is packaged and deployed into production using an MDM such as Quest for Business; we have a good partnership with Meta, so thanks to Meta for that. And one last key piece is the XR content management system: you probably need a 3D CMS to distribute, manage, and provide access to the 3D assets we have created, because we have lots of trainings and lots of AR applications, so we need a 3D CMS. So here is what I

play19:06

really want to talk to you about: how can we improve these building blocks? I see AI playing a crucial role here. For example, for the content creation part, imagine you could create the 3D models on the fly using automated 3D modeling. What if you could create a VR environment from simple text or from existing images? Wouldn't that be great? We are already doing that, and you can also create textures for your 3D assets, so that is a path we want to go down in terms of content creation. For spatial awareness: when we were doing the proofs of concept and building the AR applications, we observed that, in terms of tracking, reflections and lighting play an important role, so when we go to the actual environment the tracking is not as good as we expected it to be. That is where we think AI could play a crucial role, helping us understand the environment we are in and improving the tracking process. We could also use AI for quick retrieval of assets and for tagging them, for the creation of personalized avatars, and for powerful conversational interfaces. The possibilities are endless here using AI and XR.
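The "quick retrieval of assets and tagging them" idea, applied to the 3D CMS mentioned earlier, can be sketched as a ranked search over asset tags. The asset names and tags below are invented, and a word-overlap score stands in for the embedding model a production system would likely use.

```python
# Hypothetical 3D CMS index: each asset carries AI-generated tags.
ASSETS = {
    "bioreactor_v2.fbx": {"bioreactor", "stainless", "vessel"},
    "filling_line.glb": {"filling", "line", "conveyor"},
    "clean_room_env.glb": {"room", "environment", "walls"},
}

def search_assets(query: str, top_k: int = 2) -> list[str]:
    """Rank CMS assets by how many query words match their tags.
    A real system would compare embeddings instead of raw words."""
    words = set(query.lower().split())
    ranked = sorted(ASSETS, key=lambda a: len(ASSETS[a] & words), reverse=True)
    return ranked[:top_k]

print(search_assets("stainless bioreactor vessel"))
```

Swapping the overlap score for cosine similarity over embeddings gives the same interface with far better recall on paraphrased queries.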

play20:29

So what is next for us? At Pfizer we have a vision that we want XR to be the interface for AI on the shop floor. We have this industrial, Industry 4.0-style architecture where we want to collect all the data from the different software processes, feed it into one single data channel, and from that feed it into a Gen AI system which can make our data smarter, more productive, and more adaptive, so it can help our shop-floor operators make better decisions on the shop floor. And we do not want to push XR; we want operators to use it wherever they want to use it. For example, if an operator requires location-based guidance, we can add AR there to guide them, or overlay some AR instructions on top of that. So at Pfizer we really believe that XR and AI will converge, and that this will play a crucial role in shaping the manufacturing shop floor.

So if you like our story, if you want to be part of it, and if you want to help us improve these building blocks of XR applications, we are open to collaborations. Thank you so much, and feel free to reach out to us; we are staying here after the talk as well, so if you have any suggestions or collaboration opportunities, we would love to hear them. Thank you so much, and we are open to taking any questions you have.
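The "one single data channel" in that vision is commonly built as a publish/subscribe namespace that every shop-floor system writes into and every consumer (a Gen AI system, an XR overlay) reads from. Below is a minimal in-process sketch of that pattern; the topic naming and payloads are illustrative assumptions, not Pfizer's architecture.

```python
from collections import defaultdict

class DataChannel:
    """Minimal pub/sub channel: producers publish to topics,
    consumers subscribe with a handler callback."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self._subs[topic]:
            handler(payload)

channel = DataChannel()
seen = []
# e.g. an AI consumer listening to one line's temperature stream
channel.subscribe("site1/line2/temp", seen.append)
channel.publish("site1/line2/temp", {"value_c": 36.9})
print(seen)
```

In production this role is usually played by a broker such as an MQTT or Kafka deployment; the in-process version just shows the data flow.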

play22:04

Awesome. Just raise your hand if you'd like to ask a question. All right.

You won an Auggie yesterday, didn't you? Congratulations on that. I wanted to know what you thought were the elements of your build that actually won you the Auggie.

So this was for the best use of AI, if I'm not wrong, and I think it is related to what we shared with you today. As I mentioned, we are still at an early stage with Gen AI; I think the boom happened sometime in the middle of last year, so everyone is trying to find a use case where they can actually add value through Gen AI. For us, as we said, we are actually lucky to have this vast knowledge base of historical manufacturing data that we can take advantage of. As they say, you can be better if you really know your past, and this translates to our predictive and generative models: we learn from our mistakes and from our past processes, and we can use that to make our new production runs better, and this has a direct impact on the product we are delivering. So I think this is why the award came to us, because we are really making an impact with AI and

play23:24

AR.

Can you talk a little bit more about what the user experience is in using the product? Like, how exactly is generative AI being used?

OK, so we are using those large language models, and as I mentioned earlier, we are working in a highly regulated space, so we need secure versions of those models. Currently within Pfizer we have deployed our private instances of those large language models, and we use them to query our data, essentially to talk to our data. AR comes into place in creating an easier interface to talk to the data and then also to visualize the result, and this is where we see the synergy between the two. Thank you.
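"Talking to our data" through a private LLM usually means a retrieval step that packs matching records into a grounded prompt. The sketch below illustrates only that plumbing; the batch records and prompt wording are invented, and no actual model is called.

```python
# Hypothetical manufacturing records a private LLM might be queried over.
BATCH_RECORDS = [
    "Batch A12: yield 91%, deviation logged on mixing step.",
    "Batch A13: yield 97%, no deviations.",
]

def build_prompt(question: str) -> str:
    """Select records sharing words with the question and wrap them in a
    grounded prompt; falls back to all records if nothing matches."""
    words = set(question.lower().split())
    hits = [r for r in BATCH_RECORDS if words & set(r.lower().split())]
    context = "\n".join(hits) if hits else "\n".join(BATCH_RECORDS)
    return f"Answer from these records only:\n{context}\n\nQ: {question}"

print(build_prompt("Which batch had a deviation?"))
```

The returned string would be sent to the privately hosted model, and the answer rendered in the AR interface described above.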

play24:11

Thank you for sharing your journey and your experience with us. Can you give us some examples of the scale of your deployment? How many devices, how many people have you put through the training? And obviously content creation and maintenance, especially in 3D, is even more expensive than in video; have you considered using AI to accelerate or optimize that? Thank you.

Yeah, I'd say for the content creation, it's something we have already been actively exploring. I think San mentioned the asset optimization and our pipeline for creating assets; that is something we are trying to integrate Gen AI with, so we can reduce the development times for our assets. In terms of our scale, we have about 40 manufacturing sites across our network, and we have XR training at about 20 of those, so about half the network is leveraging some form of extended reality

play25:17

training.

Hi, thanks for the talk. I would like to know about the layers of abstraction you had to build to deploy this system. What comes first? Is the LLM the foundation, with everything resting on it, or are there other things that have to run in parallel for a better deployment of the system?

So the LLM is one part of the overall product. What I mentioned there is that it is really important to have the data in the right format: an AI model is only as good as your data. So a really big focus is put on standardizing our data catalog and adding extra context to it, and specifically for AR, adding spatial context to our data, so it can then be surfaced by our interfaces. That is the high-level pipeline.

Sorry, I need one more clarification, if I may ask, if it does not violate the NDA, obviously. So the spatial awareness comes from the LLM, or is it completely separate?

Separate.

Separate, OK. Thank

play26:46

you.

Any other questions?

Hi, thank you so much for your presentation. Your journey with all this started back in roughly 2018, so over what, six, seven years, you built this entire pipeline, a massive undertaking. I'm sure you have had plenty of pivots and pitfalls, things you have had to learn over the course. What would you say, in your opinion, are some of the biggest lessons or key takeaways from having to make changes like that over time?

Maybe something on my side. When working with vendors, we see there are multiple vendors supporting the pharma industry or the manufacturing industry, and we need to make sure their roadmaps align with ours. We have seen a few times that some of our partners might change direction; maybe they have multiple clients, so we might not be their first priority. So it is really important to make these strategic collaborations and stay in sync with the people you are partnering with. I think this is something important; we had a few successes and maybe a few failures there, but we learned our lesson.

Personally, from my end, I think it was the adoption part. We are used to using headsets and all those cool things, but the operators on the shop floor are not used to them, and that is where the main challenge was. For VR it was good, they were ready to use it, but in terms of AR on the shop floor, a HoloLens is a very bulky device, so we made a shift and started using iPads, which they were more comfortable with. And I think it will still grow: once the technology is much better and the form factor is reduced, I think they will realize its true potential and will be OK with starting to use the headsets and everything. Nick, do you want to add anything?

Sure, yeah, I was just going to say that flexibility is definitely a big one too, because when we brought the development in-house we thought we had it all figured out, but the funding climate changed, and being able to offer other, lower-cost solutions like 360 video training, and diversifying the portfolio and the different options you have for the business depending on the climate, is important as

play29:13

well.

We have got time for one more. Oh, there we go.

Thank you so much for sharing your training development journey. What were the considerations that made you switch from an external vendor to bringing everything in-house?

Yeah, I think the first one: with some of those initial proofs of concept that we rolled out and deployed, once the end users started to get their hands on them, they had a lot of feedback and changes they wanted made, and as soon as we collected those changes and brought them to the vendor, it was a change request and another cost. When we moved to the in-house model, we were able to use an agile format, so every two weeks we were deploying a new build to the end users and getting their feedback, really owning that code base and being able to make any changes we needed without having to go through a third party essentially every

play30:14

time.

All right, that wraps up our session; please give them a hand. Thank you.

And that is our last session in this room for today, so you guys have a good evening. Thank you, thank you very much.
