FSE100 - Analysis
Summary
TL;DR: This lecture delves into engineering analysis, defining it as a set of techniques to evaluate and understand system behavior. It distinguishes between model verification, ensuring a model meets specifications, and validation, confirming it aligns with customer expectations. The talk covers various analysis tools, including simulation, retrospective studies, and statistical methods, emphasizing their importance in iterative engineering design to refine models and meet requirements.
Takeaways
- Analysis in engineering design involves breaking down an object or system to understand its basic building blocks and their relationships.
- Model verification is about ensuring the model behaves as expected by the engineers, focusing on internal mechanisms and specifications.
- Model validation checks if the model meets the customer's expectations and performs as the system is supposed to, considering external inputs and outputs.
- The iterative aspect of analysis is crucial for identifying and fixing issues in models, leading to improved and more accurate representations of the system.
- Various tools and techniques are available for analysis, including simulation, retrospective studies, statistical methods, and mathematical principles like calculus and linear algebra.
- The goal of analysis is to gain insights into the system's behavior, evaluate the model's quality, and ensure it meets both internal and external expectations.
- The iterative process of engineering design involves cycling between modeling and analysis to produce a valid and verified solution that meets all requirements.
- In software engineering, prototyping is a common technique to create a simplified version of the product for customer feedback and internal evaluation.
- Statistical techniques are particularly useful for handling variability, a common challenge in engineering that can cause issues if not properly managed.
- There's a close relationship between modeling techniques and analysis, with the latter often informing and improving the former throughout the engineering process.
Q & A
What is the primary focus of the lecture?
-The lecture primarily focuses on analysis from the perspective of engineering design, emphasizing the definition of analysis in engineering, the differences between model verification and validation, and the tools available for analysis.
How does engineering define analysis?
-In engineering, analysis is defined as a set of techniques or a collection of tools used to evaluate a model or system, leading to an understanding of how the model or system behaves, allowing for better interaction with the environment and making changes to these models.
What is the difference between model verification and model validation?
-Model verification is the process of ensuring that the model behaves as expected according to the internal specifications set by the engineers, while model validation is about ensuring that the model conforms to what the system is supposed to be doing and meets the customer's expectations.
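The lecture's bank check-processing example can sketch this distinction in code. The class, names, and numbers below are hypothetical illustrations, not from the lecture: the validation check looks only at the external input-output behavior the customer cares about (the money moved), while the verification check inspects the internal mechanism against the engineers' own specification (every transfer is recorded in a ledger).

```python
# Hypothetical check-processing model used to illustrate the distinction.
class CheckProcessor:
    def __init__(self):
        self.ledger = []  # internal record of every transfer (engineers' spec)

    def process(self, payer, payee, amount, balances):
        """Move `amount` from payer to payee and record the transfer."""
        balances[payer] -= amount
        balances[payee] += amount
        self.ledger.append((payer, payee, amount))  # internal mechanism
        return balances

# Validation: external behavior the customer cares about -- the check is processed.
balances = {"alice": 100, "bob": 0}
model = CheckProcessor()
model.process("alice", "bob", 40, balances)
assert balances == {"alice": 60, "bob": 40}   # money moved as expected

# Verification: internal mechanism the engineers specified -- a ledger entry exists.
assert model.ledger == [("alice", "bob", 40)]
```

The bank (validation) never looks at `ledger`; only the engineers (verification) do.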
Why is it important to perform both model verification and model validation?
-Both model verification and model validation are important because they ensure that the model not only meets the internal specifications (verification) but also aligns with the external expectations and requirements of the customers (validation).
What is the iterative aspect of analysis within engineering design?
-The iterative aspect of analysis within engineering design refers to the continuous process of improving the model by identifying issues through verification and validation, making improvements, and then re-analyzing the updated model to ensure it is valid and verified.
What are some common tools used for analysis in engineering?
-Common tools for analysis in engineering include simulation, retrospective studies, statistical techniques, basic mathematical principles like calculus and linear algebra, and software prototyping.
How does simulation contribute to the analysis process?
-Simulation contributes to the analysis process by allowing engineers to create a simulated environment to test 'what if' scenarios and evaluate whether the model performs as expected or as the customers expect it to.
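A minimal 'what if' sketch of this idea, assuming a toy check-processing queue (the function name, arrival model, and numbers are all invented for illustration): by seeding the random generator, two staffing levels can be compared against the same simulated demand.

```python
import random

def simulate_day(arrival_rate, capacity, hours=8, seed=0):
    """Toy 'what if' simulation: how many checks are left unprocessed
    at the end of the day for a given hourly processing capacity?"""
    rng = random.Random(seed)        # fixed seed -> reproducible scenario
    backlog = 0
    for _ in range(hours):
        backlog += rng.randint(0, arrival_rate)  # checks arriving this hour
        backlog = max(0, backlog - capacity)     # checks we can process
    return backlog

# 'What if' question: does more capacity leave a smaller (or equal) backlog?
assert simulate_day(arrival_rate=20, capacity=25) <= simulate_day(arrival_rate=20, capacity=15)
```

The same pattern scales up: vary one input, hold the simulated environment fixed, and compare outcomes against what the model or the customer expects.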
What is the purpose of using retrospective studies in analysis?
-The purpose of using retrospective studies in analysis is to collect past data, apply it to the current model, and evaluate how well the model performs relative to historical outcomes, with the goal of improving the system.
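A retrospective study can be sketched as scoring the current model against data collected in the past. The data points and the toy model below are hypothetical; the point is the pattern of replaying history through the model and measuring the gap.

```python
# Hypothetical historical data: (input, observed outcome) pairs from the past.
historical = [(10, 21), (20, 39), (30, 61), (40, 79)]

def model(x):
    """Current model under analysis: predicts roughly 2*x."""
    return 2 * x

def mean_abs_error(model, data):
    """How far off is the model, on average, from what actually happened?"""
    return sum(abs(model(x) - y) for x, y in data) / len(data)

error = mean_abs_error(model, historical)
# predictions 20, 40, 60, 80 vs. outcomes 21, 39, 61, 79 -> mean error 1.0
```

If a revised model scores a lower error on the same historical data, that is evidence of the improvement the lecture describes.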
Why are statistical techniques important in engineering analysis?
-Statistical techniques are important in engineering analysis because they help classify, identify, and manipulate variability, which can cause problems in engineering. By understanding and controlling variability, engineers can improve their models.
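One common statistical way to classify and identify variability is to flag observations far from the mean, here using a standard k-sigma rule from the stdlib `statistics` module (the sample data is invented for illustration):

```python
import statistics

def out_of_control(samples, k=2.0):
    """Flag samples more than k standard deviations from the mean --
    a basic way to identify unusual variability in a process."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return [x for x in samples if abs(x - mu) > k * sigma]

# Hypothetical check-processing times in seconds; one is suspicious.
times = [5.1, 4.9, 5.0, 5.2, 4.8, 9.5]
print(out_of_control(times))   # the 9.5 s outlier stands out
```

Once the unusual variability is identified, the engineer can investigate its cause and improve the model, as the answer above describes.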
How do basic mathematical principles aid in engineering analysis?
-Basic mathematical principles such as calculus, differential equations, linear algebra, and real analysis aid in engineering analysis by providing the necessary tools to find optimal points, solve systems of equations, and perform other mathematical operations that are crucial for model evaluation and improvement.
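Both uses mentioned in the lecture, finding an optimal point with calculus and deciding whether a system of equations has a solution with linear algebra, can be shown in a few lines (the particular function and system are made-up examples):

```python
# Calculus: the minimum of f(x) = x^2 - 4x + 7 is where f'(x) = 2x - 4 = 0.
x_min = 4 / 2   # x = 2, found by setting the derivative to zero

# Linear algebra: a*x + b*y = e, c*x + d*y = f has a unique solution
# exactly when the determinant a*d - b*c is nonzero (Cramer's rule).
def solve_2x2(a, b, c, d, e, f):
    det = a * d - b * c
    if det == 0:
        return None  # no unique solution exists
    return ((e * d - b * f) / det, (a * f - e * c) / det)

# 1x + 2y = 5 and 3x + 4y = 11  ->  x = 1, y = 2
print(solve_2x2(1, 2, 3, 4, 5, 11))
```

The determinant test is exactly the "does a solution exist" question the transcript raises; calculus supplies the optimum, linear algebra the solvability.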
What is the role of software prototyping in the analysis process?
-Software prototyping plays a role in the analysis process by allowing engineers to create a simplified version of the product that can be tested and evaluated by customers. This helps in gaining insights into whether the prototype meets expectations and functions as intended, guiding further improvements.
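A software prototype in this sense is a slimmed-down version with the core path working and the bells and whistles stubbed out. The class and method names below are hypothetical, continuing the check-processing example:

```python
# Hypothetical prototype: only the core 'process a check' flow is implemented,
# so customers can react to it early; extras are deliberately stubbed out.
class CheckAppPrototype:
    def process_check(self, amount):
        return f"processed check for ${amount}"   # core behavior to validate

    def fraud_detection(self, *args):
        raise NotImplementedError("left out of the prototype on purpose")
```

Customer feedback on `process_check` validates the core idea long before the full product exists.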
Why is the iterative nature of engineering important for analysis?
-The iterative nature of engineering is important for analysis because it allows for continuous improvement of the model through cycles of modeling, analysis, identification of issues, and refinement. This iterative process ensures that the final solution is bug-free, meets customer requirements, and is both valid and verified.
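The model-analyze-refine cycle can be sketched as a loop that keeps improving the model until analysis says the requirement is met. Everything here (the parameter, target, and halving refinement rule) is an invented toy, chosen only to make the cycle concrete:

```python
def analyze(model_param, target):
    """Analysis step: measure how far the model is from the requirement."""
    return abs(model_param - target)

def refine(model_param, target):
    """Improvement step: move the model halfway toward what analysis found."""
    return model_param + (target - model_param) / 2

# Iterative design loop: model -> analyze -> improve -> analyze again ...
param, target, tolerance = 0.0, 10.0, 0.1
iterations = 0
while analyze(param, target) > tolerance:
    param = refine(param, target)
    iterations += 1
# The loop exits only once the model is within tolerance of the requirement.
```

Each pass identifies the remaining gap (analysis) and constructs a better model (refinement), terminating when the solution satisfies the requirement, which mirrors the valid-and-verified endpoint the lecture describes.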
Outlines
Introduction to Engineering Analysis
This paragraph introduces the concept of analysis in the context of engineering design. It defines analysis from an engineering perspective as breaking down an object or system to understand its basic building blocks and their relationships. The lecture aims to differentiate between model verification and validation, discuss available analysis tools, and emphasize the iterative nature of analysis in engineering design. The importance of gaining insights into a system's behavior and evaluating the quality of models is highlighted, with a focus on ensuring models conform to system expectations and customer needs.
Tools and Techniques in Engineering Analysis
The second paragraph delves into various common analysis tools and techniques used in engineering. It discusses simulation, both computer-based and physical, as a method to test models in simulated environments. The paragraph also mentions retrospective studies for analyzing past data against current models, statistical techniques for handling variability, and the application of mathematical principles like calculus and linear algebra. Software prototyping is introduced as a way to create simplified versions of products for customer feedback. The paragraph concludes by emphasizing the iterative process of engineering design, where analysis is used to identify and improve issues in models to meet specifications and customer requirements.
Keywords
Analysis
Engineering Design
Model Verification
Model Validation
Iterative Process
Simulation
Retrospective Study
Statistical Techniques
Mathematical Principles
Software Prototyping
Customer Requirements
Highlights
Analysis in engineering design involves breaking down an object or system to understand its basic building blocks and their relationships.
Wikipedia defines analysis as applying scientific principles to reveal properties and the state of a system.
Engineering analysis is a set of techniques and tools used to evaluate a model or system for better understanding and interaction.
The purpose of analysis in engineering is to gain insight into the system and evaluate the quality of the model.
Model validation checks if the model behaves as the system is supposed to, ensuring it meets customer expectations.
Model verification ensures the model meets the specifications set by the engineers during the design process.
The difference between validation and verification is that validation is external, focusing on input-output relationships, while verification is internal, focusing on mechanisms.
Analysis tools in engineering include simulation, retrospective studies, statistical techniques, and mathematical principles.
Simulation allows for testing 'what if' scenarios and evaluating model performance in a simulated environment.
Retrospective studies involve applying past data to current models to see how they perform relative to historical outcomes.
Statistical techniques help in classifying, identifying, and improving variability within a system.
Basic mathematical principles like calculus, differential equations, and linear algebra are crucial for engineering analysis.
Software prototyping is a common technique in software engineering, creating a simplified version of the product for customer feedback.
The iterative nature of engineering involves cycling between modeling and analysis to improve the model.
The goal of analysis is to produce a model that is valid, verified, and meets all customer requirements by the end of the engineering design process.
Analysis techniques are closely tied to modeling techniques, and understanding one improves the other.
The iterative process of engineering design allows for the identification and improvement of issues in the model.
Transcripts
so in this lecture we are going to talk
about analysis from the perspective of
the engineering design more specifically
we're first going to define analysis
from the perspective of engineering and
then we're going to look at the
differences between model verification
and model validation we're gonna look at
some tools that are available to us for
analysis and then we're going to focus
on the iterative aspect associated with
analysis within engineering design there
are many different types of analysis
that exist and depending on your
background your discipline may be where
you've worked in the past you're going
to hear different types of definitions
about analysis TeachEngineering talks
about analysis from the perspective of
breaking down an object dealing with the
system the problem and fundamentally
looking at it at its basic building
blocks to create the essential features
and their relationships to one another
Wikipedia which tends to simplify a lot
of definitions looks at it as an
application of our overall scientific
principles to reveal some properties and
the state of the system that our model
is actually trying to represent from
engineering we tend to look at it more
so as a set of techniques it's a
collection of the tools available to us
that we can use to evaluate a model or
system and it will eventually lead us to
some understanding and some better
insights about how the model or the
system behaves allowing us to better
interact with the environment and make
changes to these models the reason why
we tend to do analysis in engineering
is primarily to gain some insight about
the system and to evaluate the overall
quality of our model when we're gaining
insight what we're learning about is the
way the system is supposed to be behaving
and we're checking to see whether
or not our model actually
comports to what the system is doing
this is generally called model
validation we need to make sure we are
gaining an understanding of the
individual building blocks of our system
so that our model represents what the
system is supposed to actually be doing
on the other side of analysis we can
actually evaluate the quality of our
model to make sure that our model is
behaving in such a way that it is
matching to what we have an expectation
of there is one side of the system that
is building the way it's supposed to
build and on the other side
we want to make sure that what we are
putting into the system is doing what we
expect it to do this is an opportunity
for us to find mistakes in our models
it's an opportunity for us to find small
little issues that we might not have
originally thought existed and so with
the evaluation side of things we can
begin constructing more complete models
that represent what we want it to do
while the validation side of it is
already taking into account what the
system is supposed to be doing these two
sections are very important from a
perspective of engineering analysis we
must make sure that the model is
conforming to what the system is
supposed to be doing another way of
looking at it is whether or not our
model is actually performing to meet
whatever the customers expect it to do
so if you think about possibility of a
bank coming to you and asking for you to
redesign how they process checks from a
perspective of model validation we need
to make sure that our model actually
processes checks and that it functions
in a way that the bank expects it to
function the other side of this process
is the verification side this is where
we are trying to make sure that our
model does what we expect it to do we
have an internalized process of how to
process those checks we need to know how
the checks are going to be moving
through the system we need to know how
information is stored how the money is
managed how the money is moved and that
is the verification side of things we're
going to make sure that when we are
verifying a model that it meets the
specifications that we have put forward
as the engineers in the design
requirements the bank doesn't
necessarily care about how the checks
are processed they only care about that
they are processed this is the distinct
difference between model validation and
model verification model validation is
sometimes seen as an externalized
process where we're more worried about
input and its relationship to output or
as verification tends to be more
internalized or worried about the
mechanisms by which input is turned into
output that is why we have to look at
both of these from an analysis
standpoint because if we miss the
validation process then there's a good
chance we're not meeting our customers
needs if we miss the verification
process there's a chance our model will
not perform as we expect it might
just fail at the building blocks
a bit more of this
will come up when we begin talking about
testing and implementation because the
testing and implementation sections are
centered around improving our models to
a point where they are valid and
verified while going through the entire
iterative process of engineering design
when it comes to the actual process of
analysis there are more tools than we
can probably talk about we have
different disciplines within CIDSE
ourselves we have the different majors
outside of CIDSE we have all the other
engineering majors we need to make sure
that we're covering all different types
of analysis tools but there's just too
many to count
so I'm going to go over some of the more
common ones that you will see within our
department that we like to focus on
first simulation tends to be the most
common this is either through computer
techniques or through live techniques
the whole idea is that we can simulate
an environment we can test what if
situations and from all of that we can
actually evaluate whether or not our
model is doing what we expect it to do
or whether or not the model is doing
what our customers are expecting it to
do another known method is what is
called testing unknown data usually
called a retrospective study we go back
into the past we collect data we apply
that data to the current model we're
looking at and we see how the model
performs relative to what happened in
the past hopefully we have an
improvement the whole idea is to improve
a system we can also use statistical
techniques when we're dealing with
things like variability variability is a
difficult piece to deal with when
dealing with engineering because
variability has a tendency to cause
problems and in statistical techniques
we can generally classify the
variability identify it figure out ways
to manipulate it or to improve it
ultimately improving our model overall
we also can consider some of the basic
math principles that you guys have
put together over time things like
calculus differential equations linear
algebra even higher levels of like
number theory real analysis they all
play a role in analysis techniques for
engineers if we need to find the maximum
point or the minimum point you're going
to have to use calculus if we need to
find whether or not a system of
equations has a solution you're looking
at linear algebra all of these different
tools that you've put together over the
years will eventually create a
well-rounded toolbox
of analysis techniques that you can use
at any point in time in the realm of
software a very common technique is
software prototyping it is generally
cumbersome to create a finalized product
in the middle of our engineering design
process so what we want to do is we want
to create a slimmed down version one
that doesn't have as much details bugs
and/or bells and whistles and so we want
to create this slim down version that
our customers can look at and to give us
some insight as to whether or not it's
working the way they expect it to work and
whether or not you think internally it's
working in all reality there are many
many more we have a possibility of any
sort of consideration and it's generally
based on modeling the technique that you
use for analysis is tied to modeling
there's an intimate relationship between
the two of them so as you learn more
modeling techniques you will learn more
analysis techniques throughout the years
ultimately the goal of this is to
improve our model we're trying to
produce the best possible model by the
end of this engineering design process
so it is important that we take the
steps necessary to slowly improve our
model because we are gaining some
insight and some better understanding
about our analysis and about our system
we can actually improve our model the
only way we can actually improve it is
if we have some sort of mechanism of
identifying the issues this is the whole
point of analysis we can identify those
issues either system-wide or internally
through the verification and validation
processes and from that we can actually
determine whether or not our model meets
all the specifications the customers
have put forth meets our own
internalized specifications and doesn't
have any major bugs or any major issues
that we need to repair the goal once we
spot these issues is to improve on them
constructing a new advanced better model
and then we go back and we analyze it
again this is the iterative nature of
engineering and this is what allows us
to create solutions that are bug free
hopefully produce the solution that we
expect it to produce and meet all of our
customer requirements because of this
iterative nature there's a lot of
relationship going on between modeling
analysis and subsequently the other side
of simulation and prototyping but
fundamentally as we do
cycling between modeling analysis we can
produce a solid solution that is valid
and verified and meets all of the
requirements we expect it to meet this is
the goal of analysis thank you for
watching