Unleashing the Power of Gen AI and XR at Pfizer
Summary
TLDR: The video script details Pfizer's journey in integrating XR and AI technologies to revolutionize manufacturing. George Hanaras introduces the Smart Factory team's efforts to enhance efficiency and reduce human error by leveraging AR and AI. Nicholas Hawley discusses the development of XR training to virtualize the manufacturing shop floor, improving accessibility and retention. San Sharma highlights the technical stack and AI's role in content creation and spatial awareness. The team shares insights on strategic collaborations, user adoption, and the importance of data standardization for effective AI integration.
Takeaways
- 😀 George Hanaras from Pfizer's Smart Factory team discussed leveraging XR and AI technologies to improve manufacturing efficiency and reduce human error.
- 🛠️ The team identified pain points in manufacturing, such as complex procedures and environments, the need for guidance for operators, and issues with siloed systems.
- 🤖 They explored AR as a solution, experimenting with various hardware and software options to create immersive experiences that could be deployed in a regulated environment.
- 🔧 The decision was made to opt for an in-house development model, utilizing the Unity 3D engine and forming agile teams to develop and deploy AR solutions.
- 🔗 Emphasis was placed on integrating systems and data to bring the right information to the right user at the right time, standardizing and democratizing data for immersive experiences.
- 📚 The Smart Factory team recognized the need for digitizing tacit knowledge and providing end-to-end visibility of processes and tasks for operators.
- 🏆 The team has won awards for their work and is now exploring the integration of AI with AR, using predictive models and large language models to enhance manufacturing operations.
- 🎓 Nicholas Hawley highlighted the use of XR for training, emphasizing the importance of extending the reality of manufacturing assets to people for better training outcomes.
- 🌐 The Pfizer Verse platform was introduced as a centralized hub for training content, enabling multi-user experiences and real-time interaction with trainers in virtual environments.
- 🛑 The journey of implementing XR in training involved starting with research, building proofs of concept, and eventually bringing development in-house for cost efficiency and agility.
- 🔮 Looking forward, the team is exploring AI in XR for asset optimization, generative voice and avatars, and the potential for AI to improve the user experience in manufacturing.
Q & A
Who is George Hanaras and what is his role in the Smart Factory team?
-George Hanaras is a member of the Smart Factory team in Pfizer's Digital Manufacturing organization. He is involved in leveraging innovative technologies like XR (Extended Reality) and AI to support manufacturing plants and improve front-line efficiency.
What is the primary goal of the Smart Factory team in terms of technology implementation?
-The primary goal of the Smart Factory team is to utilize innovative technologies such as XR and AI to support manufacturing plants, reduce human error, and enhance front-line productivity.
What challenges did Pfizer's manufacturing plants face that led them to explore AR technology?
-The challenges included complex procedures and environments, the need for operators to navigate multiple siloed systems for information, unpredicted issues without clear root causes or corrective actions, and a significant amount of tacit knowledge not being digitized.
How did the Smart Factory team approach the integration of AR technology in their operations?
-They started by researching the market, identifying AR as a potential solution, and experimenting with various hardware and software components. They focused on high-value use cases and received positive feedback from early adopters, leading to the establishment of an in-house development model.
What is the significance of Unity 3D engine in the Smart Factory team's development process?
-The Unity 3D engine is at the center of their internal delivery pipeline, allowing the team to quickly develop prototypes and deploy AR solutions in a compliant manner.
How does the Smart Factory team ensure that the AR solutions are aligned with business requirements?
-They have formed internal agile teams consisting of 3D designers, artists, and developers who can convert business requirements into end-to-end AR experiences and deploy them compliantly.
What is the role of AI in the Smart Factory team's current initiatives?
-AI is used to train predictive models to find correlations in historical manufacturing data, allowing for instant notifications of deviations during new production runs. Additionally, AI is used to generate targeted action points and convert textual output into AR experiences.
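As a rough illustration of the predictive-model idea described above (a hypothetical sketch with illustrative data, names, and thresholds, not Pfizer's actual pipeline), flagging deviations in a new run against a profile built from historical batches might look like:

```python
# Hypothetical sketch: build a mean/stddev profile per time step from
# historical batches, then flag readings in a new run that deviate
# beyond a z-score threshold. All names and values are illustrative.

def build_profile(historical_runs):
    """historical_runs: list of equal-length lists of sensor readings."""
    n = len(historical_runs)
    profile = []
    for t in range(len(historical_runs[0])):
        values = [run[t] for run in historical_runs]
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / n
        profile.append((mean, var ** 0.5))
    return profile

def find_deviations(new_run, profile, z_threshold=3.0):
    """Return the time indices where the new run deviates from the profile."""
    flagged = []
    for t, (value, (mean, std)) in enumerate(zip(new_run, profile)):
        if std > 0 and abs(value - mean) / std > z_threshold:
            flagged.append(t)
    return flagged
```

Any flagged index would then trigger an instant notification to the operator, as described in the answer above.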
What is the purpose of the 'connected worker interfaces' developed by the Smart Factory team?
-The connected worker interfaces aim to bring multiple systems together in a single pane of glass, enabling end-to-end visibility of processes and the visibility of the next critical task for the user, enhancing efficiency and reducing the need for manual data navigation.
How does the Smart Factory team address the issue of wearables not being mature enough for full shift adoption?
-They decided to use mobile devices, specifically tablets mounted on trolleys or movable carts, allowing operators to perform complex tasks hands-free when needed while still benefiting from the AR element.
What is the vision of Fizer for the convergence of XR and AI in the future of manufacturing?
-Pfizer envisions XR as the interface for AI on the shop floor, collecting data from various software processes into a single data channel that feeds into an AI system to make data smarter and more adaptive, aiding operators in making better decisions.
What are some of the technical considerations for deploying AR and VR applications in manufacturing as discussed by San Sharma?
-The technical considerations include content creation from CAD drawings and 360° scans, spatial awareness setup using image tracking or plane detection, development platforms like Unity and Xcode, SDKs and APIs like ARKit and the XR Interaction Toolkit, and networking libraries for multiplayer experiences. Additionally, there is a focus on data integration from IoT sensors and the use of XR content management systems for asset distribution and access.
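The data-integration layer mentioned above can be sketched as a small normalization step (a hypothetical sketch — the source names, payload fields, and record shape are illustrative assumptions; the actual microservices and connectors are not detailed in the talk) that merges readings from several shop-floor sources into one feed an XR client or AI system could consume:

```python
# Hypothetical sketch: map source-specific payloads to one common record
# shape, then merge them into a single time-ordered feed ("single data
# channel"). Source names and field names are illustrative.

def normalize_reading(source, raw):
    """Map a source-specific payload to a common record shape."""
    if source == "iot_sensor":
        return {"asset": raw["device_id"], "metric": raw["type"],
                "value": raw["reading"], "ts": raw["timestamp"]}
    if source == "mes":
        return {"asset": raw["equipment"], "metric": raw["parameter"],
                "value": raw["value"], "ts": raw["recorded_at"]}
    raise ValueError(f"unknown source: {source}")

def merge_channel(batches):
    """batches: list of (source, [raw payloads]); returns one ordered feed."""
    feed = [normalize_reading(src, raw)
            for src, raws in batches for raw in raws]
    return sorted(feed, key=lambda r: r["ts"])
```

The point of the common record shape is that downstream consumers (AR overlays, predictive models) never need to know which siloed system a reading came from.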
Outlines
🤖 Introduction to Smart Factory and AR Journey
George Hanaras introduces the Smart Factory team from Pfizer's manufacturing organization, emphasizing their goal of integrating innovative technologies like XR and AI to enhance manufacturing plant efficiency and reduce human error. The team's mission is to address pain points identified by customers, especially during the COVID-19 pandemic, when production had to scale up rapidly. They explored AR as a solution to complex procedures, siloed systems, and undigitized tacit knowledge. After positive initial feedback, they opted for an in-house development model using the Unity 3D engine for rapid prototyping and deployment, focusing on high-value use cases and collaborating with third parties when beneficial.
📲 AR Implementation and the Evolution of Wearables
The team at Pfizer faced challenges with wearables' maturity for full-shift use, leading to a shift towards mobile devices and tablets. They developed connected worker interfaces to integrate multiple systems for end-to-end visibility of processes. The adoption of AI and AR together marked a new phase, in which historical manufacturing data was used to train predictive models, enabling real-time issue notifications during production. The integration of AI with AR experiences was highlighted, focusing on adding value without creating noise for operators, particularly for complex tasks requiring guidance.
🛠️ XR Training and the Manufacturing Shop Floor Virtualization
Nicholas Hawley, a product manager for XR training at Pfizer, discusses the importance of extending the reality of Pfizer's most valuable assets: its people and the manufacturing shop floor. The team addressed issues in traditional training, such as limited access to manufacturing areas and lengthy, hard-to-retain SOPs. They implemented a four-pronged approach involving virtual twins, VR training, 360 video training, and the Pfizer Verse platform, which allows multi-user experiences for training. The journey began with research and proofs of concept, leading to the creation of a self-service model and, eventually, a centralized platform for easy access to training content.
🌐 Technical Stack and Future of XR Application Development
San Sharma, the tech lead for the Smart Factory team, provides an overview of the technical stack used for building XR applications, including content creation tools, spatial awareness technologies, development platforms like Unity, and SDKs for AR and VR. He discusses the importance of data integration from IoT sensors and the use of networking libraries for collaborative experiences. Sharma also highlights the potential of AI in content creation, spatial awareness, asset retrieval, and conversational interfaces, envisioning a future where XR serves as the interface for AI on the shop floor, with data collection and AI systems supporting better decision-making.
🏆 Award Achievement and Generative AI Integration
The discussion includes the team's recent win at the Auggie Awards for the best use of AI, attributed to their effective use of predictive and generative models based on historical manufacturing data. The team explores the use of large language models within a highly regulated space, emphasizing the need for secure versions of these models. They detail how these models are used to query data and create an easier interface for visualization, highlighting the synergy between AI and AR in their applications.
🔄 Pivots and Learnings in XR Deployment and Development
The team reflects on their journey since 2018, discussing the challenges and lessons learned in deploying XR systems. Key takeaways include the importance of aligning with vendor roadmaps, the need for strategic collaborations, and the challenges of user adoption, particularly with AR on the shop floor. They share insights on the transition from using headsets to iPads for comfort and practicality, and the importance of flexibility in development approaches to accommodate changing requirements and feedback.
👏 Closing Remarks and Openness to Collaboration
In the final paragraph, the team concludes the session with an invitation for collaboration and feedback. They express gratitude for the audience's attention and participation, encourage questions, and highlight their openness to suggestions and partnership opportunities. The session wraps up with a round of applause for the presenters and a reminder that this was the last session for the day.
Keywords
💡Smart Factory
💡XR (Extended Reality)
💡AI (Artificial Intelligence)
💡Manufacturing Efficiency
💡Human Error
💡Predictive Models
💡IoT (Internet of Things)
💡3D Modeling
💡Agile Teams
💡AR (Augmented Reality)
💡VR (Virtual Reality)
💡Generative AI
Highlights
George Hanaras introduces the Smart Factory team from Pfizer's manufacturing organization, focusing on leveraging XR and AI to improve manufacturing efficiency.
The team addresses pain points from manufacturing plants, especially during the COVID-19 pandemic, when complex procedures and environments were challenging for operators.
Smart Factory explored AR as a solution for the identified issues, adopting a startup mentality to quickly test new technologies.
Hardware and software trials for AR included wearables, mobile devices, tablets, headsets, QR codes, image targets, 3D scans, and cloud anchors.
In-house development was chosen for AR solutions due to complex customer requirements and a highly regulated space.
Unity 3D engine is central to the internal delivery pipeline for developing and deploying AR solutions.
Data standardization and democratization are key to bringing the right information to users in AR experiences.
The Connected Worker interface aims to integrate multiple systems for end-to-end visibility of processes and tasks.
Smart glasses were deemed immature for full shift use, leading to a preference for mobile devices and tablets on movable carts.
AI and AR integration is the current phase, using predictive models to identify deviations from historical patterns in manufacturing.
Large language models (LLMs) are deployed to scan through SOPs and work instructions, generating targeted action points for users.
Textual output from AI is being converted into AR experiences for seamless transition from 2D to immersive guidance.
Nick Hawley discusses the XR training journey at Pfizer, emphasizing the importance of extending reality for valuable assets like people.
XR training aims to solve issues like limited access to manufacturing areas and lengthy, hard-to-retain SOPs.
Virtual twins, 3D scanning, and custom VR training are part of the multi-pronged approach to XR training at Pfizer.
Pfizer Verse is a centralized platform for hosting and accessing all XR training content, enabling multi-user experiences.
San Sharma, the tech lead, outlines the technical stack used for building XR applications, including content creation and development platforms.
AI is seen as crucial for improving content creation, spatial awareness, asset retrieval, and conversational interfaces in XR applications.
Pfizer envisions XR as the interface for AI on the shop floor, with a focus on data collection, AI integration, and operator guidance.
The team discusses lessons learned, such as the importance of strategic collaborations and user adoption, as well as the shift from AR headsets to iPads for comfort.
The decision to bring development in-house was driven by the need for agility, cost efficiency, and the ability to quickly iterate based on user feedback.
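The LLM-over-SOPs workflow in the highlights above can be sketched as a simple retrieve-then-prompt loop (a hypothetical sketch — the talk does not specify the models or retrieval method, and a production system in this regulated space would use a secured model; the naive keyword-overlap scoring here stands in for real retrieval):

```python
# Hypothetical sketch: retrieve the SOP sections most relevant to an
# operator's issue, then assemble a prompt asking an LLM to generate
# targeted action points. Scoring is naive keyword overlap.

def score(query, section):
    """Count shared whitespace-separated tokens (case-insensitive)."""
    return len(set(query.lower().split()) & set(section.lower().split()))

def retrieve(query, sections, top_k=2):
    """Return the top_k sections ranked by keyword overlap with the query."""
    ranked = sorted(sections, key=lambda s: score(query, s), reverse=True)
    return ranked[:top_k]

def build_prompt(query, sections):
    """Assemble an LLM prompt grounded only in the retrieved excerpts."""
    context = "\n---\n".join(retrieve(query, sections))
    return (f"Using only these SOP excerpts:\n{context}\n\n"
            f"Issue: {query}\nList targeted action points for the operator.")
```

Grounding the prompt in retrieved excerpts, rather than asking the model to answer freely, is one common way to keep generated action points traceable back to controlled documents.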
Transcripts
Hello everyone, let's just wait for a few people to sit, I see they're coming now. So, hello everyone, I'm really happy to be here. I hope everyone enjoyed the show so far, and I want to thank you for staying until the end to hear us speak. My name is George Hanaras and I'm part of a team called Smart Factory, led by Ron Kelly. We're part of Pfizer's Digital Manufacturing organization, and our overall goal is to leverage innovative technology like XR and AI to support our manufacturing plants and to enable our front line to be more efficient, by reducing human error and improving their overall productivity.

For today's presentation we want to put ourselves in the place of the audience and think of what would be valuable to share when you're trying to implement innovative technology in manufacturing: what are some of the roadblocks, and what are some of the best practices we can follow to be successful? So with that, I will start with our AR journey so far, and I'll take it from the beginning.

We had our customers, our manufacturing plants, coming to us with some big pain points they had identified, especially during COVID, when we were trying to scale up at a really high pace to meet our global market demand. We were seeing that our procedures and also our environments were becoming more and more complex, and our operators needed that extra layer of guidance to perform their tasks. We were also seeing that they usually needed to navigate between multiple siloed systems to get the information they needed in order to make their decisions. In addition, sometimes there were issues that we could not predict, and when those issues did occur we didn't have a clear root cause or a clear corrective course of action. Finally, we saw there was a lot of tacit knowledge that was not digitized and was living with our more experienced SMEs.

So what we did as Smart Factory: we researched the market and identified augmented reality as a potential technology that could solve these issues. We always have this startup mentality, we want to quickly try out the latest and greatest tech and see if we can get value out of it, so we approached it from both hardware and software. On the hardware side we trialed a lot of devices: wearables, mobile devices, tablets, and headsets. On the software side we tried out the different components that can enable these AR experiences, like QR codes, image targets, 3D scans, and cloud anchors. We initially focused on the higher-value use cases, and we were happy to see that the initial feedback from our early adopters was really positive, and the demand was adding up.

So, in order to meet the complex requirements of our customers, and since we're in a space that's highly regulated and comes with some additional restrictions, what made sense for us was to opt for an in-house development model. We set up our own internal delivery pipeline; at the center of it we have the Unity 3D engine, which allows us to quickly develop, prototype, and deploy AR solutions. We formed our own internal agile teams, consisting of 3D designers, artists, and developers, who were able to take the business requirements, convert them into end-to-end AR experiences, and then deploy them in a compliant way. We don't want to reinvent the wheel: whenever it makes sense we also partner up with third parties, and we try to use existing solutions to fill the gaps in our processes.

Our next milestone was about systems and data. A basic aspect of augmented reality is the ability to bring the right information to the right user at the right location. So what we did there was work closely with our supporting teams within Digital Manufacturing, with the goal of standardizing, democratizing, and cataloging all of our data and then adding spatial context to it, so that we can bring it into our immersive experiences. We also put a lot of focus on our user journey research and our persona mapping, and we came up with these connected worker interfaces. The main concept is to bring multiple systems together in a single pane of glass and enable end-to-end visibility of processes, as well as visibility of the next critical task for each persona.

We also came to realize that the wearables, the smart glasses, were not yet mature enough to be fully adopted, and they could not be used for a full shift. That's why we decided to go with mobile devices, with tablets specifically, and as you can see we opted for setups where we have a trolley or a movable cart on the floor with the iPad set up on top of it. We still get the AR element, but our operators can also perform the more complex tasks hands-free when needed. So this takes us to today.
After a lot of trying, and a couple of Auggie Awards later, we are in a phase where we are using AI and AR together. The first step was to have our data in the right format; then we were able to train our predictive AI models and find correlations across all of our historical manufacturing data. So what we are doing now is, during a new production run, a new campaign or batch, we compare against those historical patterns, and if we see that we're deviating from them we can instantly notify our users of an issue.

Then we're bringing gen AI into the mix. We're deploying LLMs that can quickly scan through our large knowledge base of SOPs, work instructions, and user logs, and generate targeted action points for our users. We're then taking it a step further: we're converting this textual output into an AR experience, enabling a seamless transition from the 2D interface to the more immersive, location-based experience. Something to note here is that we don't want to add AR everywhere, but only when it makes sense, so we don't add extra noise for the operators; only when there are those more complex tasks that require that extra level of guidance do we jump into the AR layer.

We're still taking baby steps in this field, as, we believe, is the whole industry, but we're already excited to see some value coming out of it. We were really happy to hear during the keynote, and throughout the week, that there are many who believe AR will be the interface to AI, and this for sure needs the hardware to evolve in the same way. So we're hoping to see wearables in the future enabling this in a better way, but for us it will still be a big part of our future roadmap and we're excited for it. I would now like to pass it over to Nick to talk about our XR training space. Thank you
everyone. Thanks, George. Hey everybody, I'm Nicholas Hawley and I'm a product manager for our XR training at Pfizer. I just want to say thank you to everyone for being here and sticking it out to the end. You know what they say: you save the best for last, or maybe in our case the last one is just the request for a speaker spot, but here we are.

So, XR training. Before I get into some of the specifics, I'd like to talk about the mission. I think the one thing that XR opens up for Pfizer, and for manufacturing as a whole, is really extending the reality, literally extending the reality, of our most valuable assets: our people and our manufacturing shop floor. The manufacturing shop floor is the lifeblood of our operations; if we don't have that, we have no Pfizer, and if we don't have people, we have no Pfizer. So what we've been doing in the extended reality training space is virtualizing our manufacturing shop floor and bringing it to the people who need the information to perform their jobs and their processes on the line.

Some of the problems we saw in the training space before XR were limited access to our manufacturing areas; the requirement for multiple repetitions on physical equipment that is very hard to access on the training lines; lengthy SOPs, 100-plus pages long, that really didn't have a lot of retention when people were going through them; and human error that can cause production delays.

The way we approached solving this is a four-pronged approach. First are our virtual twins, which you can see in the video right here, where we started 3D scanning all of our manufacturing environments. That's really the backbone of all the VR training that we build; as George mentioned, it's the anchoring for our AR, and we also use it to develop our custom 3D assets of the manufacturing line. This enables our custom VR training, which you can see here, where we completely virtualize and create immersive, interactive VR training spaces for our shop floor workers. They're able to interact with objects within the space and go step by step through the trainings that they need to carry
out. Another area that we're expanding into is 360 video training. Depending on the use case and the scenario, it might make more sense to leverage 360 video, like you can see here, instead of a fully immersive, interactive training experience, and there's also a cost differentiator there. Lastly, the backbone behind all of that is our Pfizer Verse platform, which hosts all of this content and enables you to create multi-user experiences, where a GMP-qualified trainer can join the same space as a trainee, have a real-time dialogue over the virtual equipment, and even quiz them and ask them questions, so they can verify whether the trainee knows the procedure before they ever step foot on the line. So really it's a multi-pronged approach to extending reality: if we can take our assets, the production floor, and extend them to the people who need them at the time they need them, we unlock a lot of value that way.

But to get there, and to get even just the clips that you saw, it was a journey, much like the AR journey. In 2018 we started with a lot of research: we were looking across the market, understanding the immersive technology that was available, and identifying use cases to understand how XR, VR, and so on were going to make an impact on what we were doing within Digital Manufacturing. In 2019 we started to partner with a few vendors and built some proofs of concept in the VR training and 3D scanning space, really just to understand things conceptually and to have something we could market internally to get in front of the right folks, so they could see how this could be applied in manufacturing.

2020 was obviously the year that COVID hit, so there was a lot of demand to move to more virtual learning formats. And in 2020 we were able to show some of those proofs of concept built with the 3D scanning technology to the right people in the business at Pfizer, and they wanted to adopt the early version of the virtual twin technology that you saw. Now we're at 16 sites globally with over 2,000 scans. We created a self-service model where we could send the scanners to the sites; the sites were able to replicate their areas and show the people who worked on the production floor, and people externally, what was happening inside those areas. This was really the first moment where we enabled desktop XR, I guess you could say: more people were experiencing this extended reality of a production floor on their desktop.

In 2021, much like the AR journey, we started to bring our development in-house, because one of the things we realized in 2019 with the vendor builds was that there was a lot of ongoing cost whenever trainings changed. So there was a lot of cost efficiency in building them internally and managing the code base ourselves going forward. This is where we started to build some of the custom VR trainings you're seeing, really reducing the cost of development and also seeing some of the value around reduced training time and reduced human error for the trainings we were building.

2022 was really a focus on centralization. We had these different virtualized manufacturing environments that we trained in, we had code that we owned and could iterate on to build more and more trainings, and we were building an asset base to enable those trainings, but we didn't have a centralized way for people to access them, and we also didn't have a way to enable that multi-user experience. So this is when we created Pfizer Verse. Pfizer Verse is essentially like a menu, a Netflix channel, where you can go in and see all the trainings we've built for a certain area. All the different areas are there; you can join in free roam, or you can go into the training experience with your trainer, to go through the training together or individually. So that was taking all the data we had collected so far, from the 3D enablement and the in-house design, and centralizing it in an easy-to-use location.

And lastly, that leads us to 2023 and the present, where we're starting to actively explore XR and AI. From the VR perspective, we're seeing asset optimization as a really big player in the XR and AI space, in terms of optimizing our 3D asset pipeline: being able to generate textures, objects, and things like that to reduce the time it takes to build environments and the objects we use in those spaces, by feeding a generative AI reference images so it can create the objects we need. Then generative voice: having the ability to speak with a model that's trained to be an expert on the trainings in the space, to reduce the need for trainer interaction. And then also a generative avatar: the fully embodied AI avatar that can take you through a training and hold a real-time dialogue and conversation with you. These are the areas that, from an XR perspective, we're really excited to explore in the future with XR and AI. Thank you. I will be passing it off to
San. Thanks, Nick. Good afternoon everyone, so I'm the cool techie guy here. Hi, hello everyone, my name is San Sharma and I'm the tech lead for the Smart Factory team. Personally, I'm very excited to talk about the interesting work we are doing at Pfizer in terms of XR and AI.

Firstly, I would like to give a quick overview of the current technical stack we use for building XR applications at Pfizer. Basically, we start with content creation: we take the CAD drawings and the 360° scans from Matterport, feed them into a 3D modeling tool such as Maya, Blender, or 3ds Max, and create the assets and animations. Then, if you're building an AR application, you probably need to do a spatial awareness setup; depending on the environment and the use case, we use image tracking, plane detection, cloud anchors, or area targets. In terms of development platform we use Unity, because it's a versatile tool and easy to use, and we also use Xcode for building our native UI for Apple applications. For SDKs and APIs it depends on the requirements: if we're building an AR application we use ARKit, ARCore, or Vuforia, and if we're building VR applications we use the XR Interaction Toolkit. And as Nick mentioned, we're building collaborative experiences, so we end up using networking libraries such as Photon for the multiplayer features, and we also use cloud streaming for rendering 3D assets at runtime.

Then comes the data part, which is the key part. We take the data on the shop floor from the IoT sensors and integrate it into our XR applications using microservices, APIs, and connectors. Once that's all done, we package the application and test it using TestFlight or App Lab, and once the validation team has finished testing, it's packaged and deployed into production using an MDM such as Quest for Business; we have a good partnership with Meta, so thanks to Meta for that. One last key piece is the XR content management system: you need a 3D CMS to distribute, manage, and provide access to the 3D assets we have created, because we have lots of trainings and lots of AR applications, so we
need a 3D CMS right so here what I
really wanted to talk you about is how
can we improve the building blocks here
right and I see AI playing a crucial
role here so for example for the content
creation part right uh imagine you can
uh create uh the 3D models on the Fly
Right U using automated 3D modeling
right what if you can create a v VR
environment uh using a simple text or
using uh existing images right wouldn't
that be great uh we are already already
doing that and uh you can create
textures for your 3D assets that um so
that that's a path we want to go in
terms of content creation. Then there's spatial awareness: when we were doing the proofs of concept and building the AR applications, we observed that reflections and lighting play an important role in tracking, so when we go into the actual environment, the tracking is not as good as we expected it to be. That's where we think AI could play a crucial role, helping us understand the environment we're in and improving the tracking process. We could also use AI for quick retrieval of assets, for tagging them, for creating personalized avatars, and for powerful conversational interfaces. The possibilities are endless when you combine AI and XR.
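As a minimal sketch of the "quick retrieval and tagging of assets" idea, here is how keyword-based lookup against a 3D CMS catalog might work. The asset names and descriptions are invented for illustration, and a production system would more likely use learned embeddings than word overlap.

```python
# Minimal sketch of AI-assisted asset retrieval for a 3D CMS.
# Asset names and descriptions are invented; a real system would likely
# use learned embeddings rather than simple word overlap.
def tokenize(text: str) -> set[str]:
    return set(text.lower().split())

def retrieve(query: str, catalog: dict[str, str], top_k: int = 2) -> list[str]:
    """Rank assets by word overlap between the query and their description."""
    q = tokenize(query)
    scored = sorted(
        catalog,
        key=lambda name: len(q & tokenize(catalog[name])),
        reverse=True,
    )
    return scored[:top_k]

catalog = {
    "bioreactor_v2.glb": "stainless bioreactor vessel with agitator",
    "filling_line.glb": "aseptic filling line conveyor and nozzles",
    "cleanroom_env.glb": "VR cleanroom training environment",
}
print(retrieve("bioreactor vessel", catalog, top_k=1))
```

The same overlap score could drive automatic tagging, by attaching the highest-scoring vocabulary terms to each asset as metadata.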
So, what's next for us? At Pfizer we have a vision that we want XR to be the interface for AI on the shop floor. We have this Industry 4.0 architecture where we want to collect all the data from the different software processes, feed it into one single data channel, and from that feed it into a generative AI system that can make our data smarter, more productive, and more adaptive, helping our shop floor operators make better decisions. And we don't want to push XR on them; we want them to use it wherever it makes sense. For example, if an operator requires location-based guidance, we can add AR there to guide them, or overlay AR instructions on top of the task. So yes, at Pfizer we really believe that XR and AI will converge
and that this will play a crucial role in shaping the manufacturing shop floor. So if you like our story, if you want to be part of it, and if you want to help us improve these building blocks of XR applications, we are open to collaborations. Thank you so much, and feel free to reach out to us; we'll be staying here after the talk, so if you have any suggestions or collaboration opportunities, we would love to hear them. Thank you so much, and we are open to taking any questions you have.
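A toy sketch of the single-data-channel idea described above: normalize records from different shop-floor systems into one chronological feed that a generative model could consume as context. The source names, field names, and record shapes here are illustrative assumptions, not Pfizer's actual schema.

```python
# Toy sketch of funneling heterogeneous shop-floor data into one channel
# for a Gen AI system. Source names and field layouts are hypothetical.
def normalize(source: str, record: dict) -> dict:
    """Map a source-specific record into one common event shape."""
    if source == "iot":    # e.g. {"sensor": ..., "value": ..., "ts": ...}
        return {"source": "iot", "ts": record["ts"],
                "text": f"sensor {record['sensor']} read {record['value']}"}
    if source == "batch":  # e.g. {"batch_id": ..., "step": ..., "ts": ...}
        return {"source": "batch", "ts": record["ts"],
                "text": f"batch {record['batch_id']} at step {record['step']}"}
    raise ValueError(f"unknown source: {source}")

def build_context(events: list[dict]) -> str:
    """Chronological plain-text feed that an LLM prompt could include."""
    return "\n".join(e["text"] for e in sorted(events, key=lambda e: e["ts"]))

events = [
    normalize("batch", {"batch_id": "B42", "step": "mixing", "ts": 2}),
    normalize("iot", {"sensor": "temp-01", "value": 37.2, "ts": 1}),
]
print(build_context(events))
```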
Awesome. Just raise your hand if you'd like to ask a question. All right.

You won an Auggie yesterday, didn't you? Congratulations on that. I wanted to know what you thought were the elements of your build that actually won you the
Auggie.

So this was for the Best Use of AI, if I'm not wrong, and I think it's related to what we shared with you today. As I mentioned, we're still at an early stage with Gen AI; the boom happened sometime in the middle of last year, so everyone is trying to find a use case where you can actually add value through Gen AI. For us, we're lucky to have this vast knowledge base of historical manufacturing data that we can take advantage of. As they say, you can be better if you really know your past, and this translates into our predictive and generative models: we learn from our mistakes and from our past processes, and we use that to make our new production runs better, which has a direct impact on the product we're delivering. So I think this is why the award came to us, because we're really making an impact with AI
and AR.

Can you talk a little bit more about what the user experience is in using the product? Like, how exactly is generative AI being
used?

Okay, so we are using those large language models, and as I mentioned earlier, we're working in a highly regulated space, so we need secure versions of those models. Currently within Pfizer we have deployed our own private instances of those large language models, and we use them to query our data, essentially to talk to our data. AR comes into play by creating an easier interface for talking to the data and then visualizing the result. That's where we see the synergy between the two. Thank
you.

Thank you for sharing your journey and your experience with us. Can you give us some examples of the scale of your deployment, like how many devices and how many people you've put through the training? And also, content creation and maintenance, especially in 3D, is even more expensive than in video. Have you considered using AI to accelerate or optimize that? Thank
you.

Yeah, I'd say the content creation piece is something we've already been actively exploring. I think San mentioned the asset optimization and our pipeline for creating assets; that's something we're trying to integrate Gen AI with so we can reduce development times for our assets. In terms of scale, we have about 40 manufacturing sites across our network, and we have XR training at about 20 of those, so about half the network is leveraging some form of extended reality
training.

Hi, thanks for the talk. I'd like to know about the layers of abstraction you had to build to deploy this system. What comes first? Is everything based on the LLM, or are there other things that have to run in parallel for a better deployment of the
system?

So, the LLM is one part of the overall product. As I mentioned, it's really important to have the data in the right format: an AI model is only as good as your data, so a really big focus is put on standardizing our data catalog and adding extra context to it, and specifically for AR, adding spatial context to our data so it can then be surfaced by our interfaces. That's the high-level
pipeline.

Sorry, I need one more clarification, if I may, if it doesn't violate the NDA, obviously. Does the spatial awareness come from the LLM, or is it completely separate?

It's separate.

Okay, thank you.

Any other
questions?

Hi, thank you so much for your presentation. Your journey with all this started back in roughly 2018, so that's six or seven years to build this entire pipeline, a massive undertaking. I'm sure you've had plenty of pivots and pitfalls, things you've had to learn along the way. What would you say, in your opinion, are some of the biggest lessons or key takeaways from having to make changes like that over
time?

Maybe something from my side: it's about working with vendors. There are multiple vendors supporting the pharma industry, or the manufacturing industry in general, and we need to make sure their roadmaps align with ours. We've seen a few times that some of our partners change direction, or they have multiple clients, so we might not be their first priority. So it's really important to make these strategic collaborations and stay in sync with the people you're partnering with. I think that's something important; we had a few successes and maybe a few failures there, but we learned our lesson.
Personally, from my end, I think it was the adoption part. We're used to headsets and all those cool things, but the operators on the shop floor are not, and that's where the main challenge was. For VR it was fine; they were ready to use it. But for AR on the shop floor, a HoloLens is a very bulky device, so we made a shift and started using iPads, which they were more comfortable with. I think adoption will still grow: once the technology is better and the form factor is reduced, they'll realize its true potential and be okay with using headsets and everything. Do you want to add anything?
Sure, yeah. I was just going to say that flexibility is definitely a big one too. When we brought the development in house, we figured it all out, but then the funding climate changed, and being able to offer other, lower-cost solutions like 360 video training, diversifying the portfolio and the options you have for the business depending on the climate, is important as
well.

We've got time for one more. Oh, there we go.

Thank you so much for sharing your training development journey. What were the considerations that made you switch from an external vendor to bringing everything in
house?

Yeah, I think the first one was that once the end users started to get their hands on some of those initial proofs of concept we rolled out and deployed, they had a lot of feedback and changes they wanted made. As soon as we collected those changes and brought them to the vendor, it was a change request, and it was another cost. When we moved to the in-house model, we were able to use an agile format: every two weeks we were deploying a new build to the end users and getting their feedback, really owning that code base and being able to make any changes we needed without having to go through a third party every time.

All right, that wraps up our session. Please give them a hand. Thank you. And that's our last session in this room for today, so have a good evening. Thank you very much.