What is Sensor Calibration and Why is it Important?
Summary
TLDR Sensor calibration is essential to the operation of modern process plants: it ensures that sensors accurately measure key process variables such as flow, pressure, and temperature, so the control system can precisely adjust valves, pumps, and other actuators to keep the plant running safely and efficiently. The video explains what calibration is, why it matters, and how an "as-found" check followed by precise adjustments minimizes error, emphasizing calibration's role in improving control accuracy and ensuring process safety.
Takeaways
- 🔍 Sensor calibration is essential to the operation of modern process plants, ensuring that sensors accurately measure important process variables such as flow, level, pressure, and temperature.
- 📈 Calibration is a set of adjustments that makes an instrument or device operate as accurately, or error-free, as possible; error is the algebraic difference between the indicated value and the actual value.
- 🛠️ Sensor errors can arise from many causes, including an improper zero reference, span shifts due to changing ambient conditions, and mechanical wear or damage.
- 🔧 The calibration procedure begins with an "as-found" check: a calibration performed before any adjustments are made, to determine whether re-calibration is actually needed.
- 📊 In an "as-found" check, an accurate instrument generates process signals at 0%, 25%, 50%, 75%, and 100% of the process range while the corresponding outputs are recorded; this is called a "5-point" check.
- 🔄 To check for hysteresis, the deviations of the sensor output are recorded and compared going upscale and downscale.
- 🎯 If the measured deviation exceeds the maximum allowed deviation, a full calibration is required; if it is within tolerance, no calibration is needed.
- 🔩 Calibration requires a very accurate process simulator, such as a pressure supply, with a current meter measuring the transmitter's 4-20 mA output.
- 🔄 Calibrating an analog transmitter means adjusting zero and span to reduce measurement error; because the two adjustments interact, calibration is an iterative process.
- 📱 Digital transmitters can be calibrated by adjusting the analog-to-digital converter output ("sensor trim") and/or the digital-to-analog converter input ("4-20 mA trim" or "output trim").
- 🚀 After calibration, the errors are graphed again to confirm they are within the specified tolerance, ensuring efficient and safe operation of the process.
Q & A
Why do engineers in modern process plants specify sensors to measure process variables?
-Engineers specify sensors to measure important process variables such as flow, level, pressure, and temperature so that the process control system can adjust valves, pumps, and other actuators to maintain the proper values of these quantities and ensure safe operation.
What is the purpose of sensor calibration?
-Sensor calibration is an adjustment or set of adjustments performed on a sensor or instrument to make it function as accurately, or error-free, as possible, so that the actual value of the process is sensed and passed to the control system.
What factors can cause sensor measurement errors?
-Sensor measurement errors can be caused by many factors, including an improper zero reference, electronic drift over time, shifts in the sensor's range, and mechanical wear or damage.
Why does the control system need accurate sensor data?
-The control system needs accurate sensor data to make correct control decisions, such as adjusting the output of a control valve or setting the speed of a feed pump, so that the process runs efficiently and safely.
How often do sensor calibration programs require calibration to be performed?
-Modern process plants typically require instruments to be calibrated periodically, even though calibration can take a considerable amount of time, especially when the device is hard to reach or requires special tools.
What is the purpose of an "as-found" check?
-The purpose of an "as-found" check is to perform a calibration, using an accurate instrument, before making any adjustments, in order to determine whether the current calibration is within the device's tolerance and thus whether re-calibration is needed.
How is a "5-point" check performed?
-In a "5-point" check, an accurate instrument generates process signals corresponding to 0%, 25%, 50%, 75%, and 100% of the sensor's process range, and the corresponding sensor outputs, in milliamps, are observed and recorded.
How is hysteresis checked?
-To check for hysteresis, the phenomenon whereby the sensor output for a process value differs going 'downscale' from going 'upscale', the output signals corresponding to 100%, 75%, 50%, 25%, and 0% are recorded and the deviation at each check point is calculated.
What should be done if the sensor's deviation exceeds the maximum allowed deviation?
-If the sensor's deviation is greater than the maximum allowed deviation, a full calibration is performed.
What equipment is typically used to calibrate an analog sensor?
-An analog sensor is typically calibrated with a very accurate process simulator, such as a pressure supply, connected to the process side of the transmitter, and a current meter measuring the transmitter's 4-20 mA output.
How is a digital sensor calibrated?
-For a digital sensor, the incoming sensor signal can be adjusted through the analog-to-digital converter output ("sensor trim") and/or the digital-to-analog converter input ("4-20 mA trim" or "output trim").
What should the sensor's maximum deviation be reduced to after calibration?
-After calibration, the maximum deviation should be within the specified tolerance, for example reduced from 0.38% to 0.18%, well within a 0.20% tolerance.
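The deviation arithmetic behind these answers can be sketched in a few lines of Python. This helper and its 12.06 mA reading are illustrative examples, not from the video; it assumes the standard linear 4-20 mA scaling:

```python
def deviation_pct(measured_ma, pct_applied):
    """Deviation at one check point, as percent of span (hypothetical helper)."""
    ideal_ma = 4.0 + 16.0 * pct_applied / 100.0   # ideal 4-20 mA output
    return (measured_ma - ideal_ma) / 16.0 * 100.0

# At the 50% point the ideal output is 12 mA; a reading of 12.06 mA
# corresponds to a deviation of about +0.38% of span.
print(round(deviation_pct(12.06, 50), 2))  # 0.38
```

Each such deviation is then compared against the device's maximum allowed deviation to decide whether a full calibration is needed.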
Outlines
🔧 The Importance and Process of Sensor Calibration
This section introduces the key role of sensors and the importance of their measurements, such as flow, level, pressure, and temperature, and explains how these measurements help the process control system adjust valves, pumps, and other actuators to maintain proper values and ensure safe operation. The video explains how "sensor calibration" keeps these sensors working so that the actual value of the process is sensed and passed to the control system. It also covers the concept of error, its causes, and how an "as-found" check and a 5-point check determine whether calibration is needed.
📊 Analyzing Calibration Results and Making Adjustments
This section discusses how calibration results are analyzed and how the maximum deviation tolerance determines whether calibration is required. Accurate equipment, such as a pressure supply and a current meter, is used to calibrate the sensor. For an analog sensor, zero and span are adjusted to reduce measurement error, while a digital sensor can be calibrated by adjusting the analog-to-digital converter output or the digital-to-analog converter input. After calibration, the errors are graphed again to confirm they are within tolerance. The video closes by emphasizing how calibration makes process control more accurate, improving the efficiency and safety of the process.
Keywords
💡Sensor calibration
💡Process variable
💡Error
💡Process control
💡Actuator
💡Zero reference
💡Span shift
💡Mechanical wear
💡Process efficiency
💡Sensor calibration program
💡Process safety
Highlights
When engineers design modern process plants, they specify sensors to measure important process variables such as flow, level, pressure, and temperature.
These measurements help the process control system adjust the valves, pumps, and other actuators in the plant to maintain the proper values of these quantities and ensure safe operation.
How does a plant maintain the operation of these sensors to guarantee that the actual value of the process is sensed and passed to the control system? The answer is "sensor calibration".
Sensor calibration is an adjustment or set of adjustments performed on a sensor or instrument to make that instrument function as accurately, or error-free, as possible.
Error is the algebraic difference between the indication and the actual value of the measured variable.
Errors in sensor measurement can be caused by many factors, including an improper zero reference on the instrument.
Modern sensors and transmitters are electronic devices, and the reference voltage or signal may drift over time due to temperature, pressure, or changes in ambient conditions.
The sensor's range may shift due to the same conditions, or because the operating range of the process has changed.
Errors in sensor measurement may also occur because of mechanical wear or damage; usually this type of error requires repair or replacement of the device.
Errors are undesirable, since the control system will not have accurate data from which to make control decisions, such as adjusting the output of a control valve or setting the speed of a feed pump.
If the calibration is too far from the accurate process conditions, process safety may be jeopardized.
As an instrument or operations engineer in a plant, I need every instrument to have a proper calibration.
Proper calibration yields accurate measurements, which in turn make good control of the process possible.
Most modern process plants have sensor calibration programs that require instruments to be calibrated periodically.
To minimize the time needed to perform a sensor calibration, first do an "as-found" check on the instrument.
If the current instrument calibration is within the stated tolerance for the device, re-calibration is not required.
To perform an "as-found" check, an accurate and precise instrument is used to generate process signals corresponding to 0%, 25%, 50%, 75%, and 100% of the transmitter's process range.
This is called a "5-point" check; then, to check for hysteresis, the output signals corresponding to 100%, 75%, 50%, 25%, and 0% are recorded.
The deviation at each check point is calculated and compared to the maximum deviation allowed for the device.
If the deviation is greater than the maximum allowed, a full calibration is performed.
If the deviation is less than the maximum allowed, no sensor calibration is required.
Assuming a maximum deviation tolerance of 0.5%, the calibration chart shows that all deviations are less than the 0.5% maximum allowed.
Assuming a maximum deviation tolerance of 0.20%, the calibration chart shows that some deviations exceed the 0.20% maximum allowed.
To calibrate, a very accurate process simulator, such as a pressure supply, is connected to the process side of the transmitter.
After calibration, the errors are graphed again; the maximum deviation has been reduced from 0.38% to 0.18%, well below the 0.20% tolerance.
In this video, you learned the importance of sensor calibration: calibration is an adjustment or set of adjustments that makes an instrument function as accurately, or error-free, as possible.
Proper sensor calibration yields accurate measurements, which in turn make good control of the process possible.
When good control is realized, the process has the best chance of running efficiently and safely.
Transcripts
When engineers design modern process plants,
they specify sensors to measure important process variables,
such as flow, level, pressure, and temperature.
These measurements are used to help the process control system adjust the valves,
pumps and other actuators in the plant
to maintain the proper values of these quantities and to ensure safe operation.
So how does a plant maintain the operation of these sensors
to guarantee that the actual value of the process is sensed
and passed to the control system?
In this video, we will learn that the answer to that question is: “Sensor Calibration”.
Before we get started on today's video,
if you love our videos,
be sure to click the like button below.
Then make sure to click subscribe
and the little bell to receive notifications of new RealPars videos.
This way you never miss another one!
Sensor calibration is an adjustment or set of adjustments
performed on a sensor or instrument to make that instrument function
as accurately, or error free, as possible.
Error is simply the algebraic difference between the indication
and the actual value of the measured variable.
Errors in sensor measurement can be caused by many factors.
First, the instrument may not have a proper zero reference.
Modern sensors and transmitters are electronic devices,
and the reference voltage, or signal,
may drift over time due to temperature, pressure, or change in ambient conditions.
Second, the sensor’s range may shift
due to the same conditions just noted,
or perhaps the operating range of the process has changed.
For example, a process may currently operate in the range of 0 to 200 PSI,
but changes in operation will require it to run in the range of 0 to 500 PSI.
Third, error in sensor measurement
may occur because of mechanical wear, or damage.
Usually, this type of error will require repair or replacement of the device.
Errors are not desirable, since the control system will not have accurate data
from which to make control decisions,
such as adjusting the output of a control valve
or setting the speed of a feed pump.
If the calibration is too far from the accurate process conditions,
process safety may be jeopardized.
If I am an instrument or operations engineer in a plant,
I need every instrument to have a proper calibration.
Proper calibration will yield accurate measurements,
which in turn, makes good control of the process possible.
When good control is realized,
then the process has the best chance of running efficiently and safely.
Most modern process plants have sensor calibration programs,
which require instruments to be calibrated periodically.
Calibration can take a considerable period of time,
especially if the device is hard to reach or requires special tools.
In order to minimize the amount of time
that it takes to perform a sensor calibration,
I would first do an “as found” check on the instrument.
This is simply performing a calibration prior to making any adjustments.
If the current instrument calibration
is found to be within the stated tolerance for the device,
then re-calibration is not required.
To perform an “As-Found” check,
an accurate and precise instrument is used
to develop process signals corresponding to 0%, 25%,
50%, 75% and 100% of the process range of the transmitter.
The corresponding transmitter output,
in milliamps, is observed and recorded.
This is called a “5-points” check.
Then, in order to check for hysteresis,
a phenomenon whereby the sensor output for a process value
is different going 'downscale' than it is going 'upscale',
the output signals corresponding to 100%,
75%, 50%, 25%, and 0% in order are recorded.
The deviations at each check point are calculated
and compared to the deviation maximum allowed for the device.
If the deviation is greater than the maximum allowed,
then a full calibration is performed.
If the deviation is less than the maximum allowed,
then a sensor calibration is not required.
Let's assume that the maximum deviation tolerance is 0.5%.
Using the data from the calibration chart,
we see from the graph that the deviations are all less
than the maximum deviation allowed of 0.5%.
Therefore, no additional calibration is required.
Now let's assume that the maximum deviation tolerance is 0.20%.
Using the data from the calibration chart,
we see from the graph that some deviations are greater than
the maximum deviation allowed of 0.20%.
Therefore, a sensor calibration is required.
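The pass/fail decision in these two examples can be sketched in Python. The individual deviation values below are illustrative stand-ins for the video's calibration chart; only the tolerances (0.5% and 0.20%) and the worst deviation (0.38%) follow the narration:

```python
# Hypothetical as-found check: each deviation is in percent of span,
# recorded at the five points upscale, then the five points downscale.

def as_found_check(deviations_pct, max_allowed_pct):
    """Return True if the worst deviation exceeds the tolerance,
    i.e. a full calibration is required."""
    worst = max(abs(d) for d in deviations_pct)
    return worst > max_allowed_pct

# Illustrative chart data: worst deviation is 0.38% of span.
upscale   = [0.05, 0.12, 0.20, 0.31, 0.38]
downscale = [0.36, 0.30, 0.22, 0.15, 0.08]

print(as_found_check(upscale + downscale, 0.5))   # False: within 0.5% tolerance
print(as_found_check(upscale + downscale, 0.20))  # True: full calibration needed
```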
To calibrate, we need a very accurate process simulator,
in this case a pressure supply,
connected to the process side of the transmitter.
A current meter is attached to the output
to measure the transmitter’s 4-20 mA output.
Ideally, a National Institute of Standards and Technology (NIST)-calibrated simulator
and current meter are used.
In practice, we can use very accurate process meters and pressure input modules.
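The 4-20 mA loop signal the current meter reads maps linearly onto the process range, a convention the rest of the procedure relies on. A minimal sketch of that scaling (the function names are mine, not from the video):

```python
def pct_to_ma(pct):
    """Map 0-100% of the process range to the 4-20 mA loop signal."""
    return 4.0 + 16.0 * pct / 100.0

def ma_to_pct(ma):
    """Map a 4-20 mA loop signal back to percent of the process range."""
    return (ma - 4.0) / 16.0 * 100.0

print(pct_to_ma(0))    # 4.0  (0% of range -> live zero)
print(pct_to_ma(50))   # 12.0
print(pct_to_ma(100))  # 20.0
```

The offset zero at 4 mA is what lets the control system distinguish a true 0% reading from a dead loop at 0 mA.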
If we have an analog transmitter,
we must adjust zero and span to reduce the measurement error.
With an analog transmitter,
there is a ZERO and SPAN adjustment on the transmitter itself.
Zero adjustment is made to move the output to exactly 4 mA
when a 0% process measurement is applied to the transmitter,
and the Span adjustment is made to move the output
to exactly 20 mA when a 100% process measurement is applied.
Unfortunately, with analog transmitters,
the zero and span adjustments are interactive;
that is, adjusting one moves the other.
Therefore, the calibration is an iterative process to set zero and span,
but only 2 to 3 iterations are usually required.
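The iteration can be illustrated with a toy model. Nothing below models a real transmitter; the output law, the interaction factor, and the starting values are all invented to show why a couple of passes suffice:

```python
# Toy model: transmitter output is mA = zero + span * (pct / 100).
# Target: 4 mA at 0% and 20 mA at 100%. Adjusting span also disturbs
# zero slightly -- the "interaction" described in the video.

INTERACTION = 0.1  # fraction of a span change that leaks into zero (invented)

def calibrate(zero, span, tol=0.01, max_iter=10):
    """Alternate zero and span adjustments until both endpoints are in tolerance."""
    for i in range(1, max_iter + 1):
        zero = 4.0                           # set the 0% point to read 4 mA
        correction = 20.0 - (zero + span)    # error at the 100% point
        span += correction                   # set the 100% point to read 20 mA...
        zero += INTERACTION * correction     # ...which disturbs zero a little
        if abs(zero - 4.0) < tol and abs(zero + span - 20.0) < tol:
            return zero, span, i
    return zero, span, max_iter

zero, span, iters = calibrate(zero=3.6, span=15.5)
print(zero, span, iters)  # 4.0 16.0 2 -- converged in two passes
```

Even this crude model converges in two passes, consistent with the 2-to-3 iterations the video mentions.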
With a digital transmitter, we can adjust the incoming sensor signal
by adjusting the Analog to Digital converter output,
which is called “sensor trim”, and/or the input to the Digital to Analog converter
in the output circuit, which is called “4-20 mA trim” or “output trim”.
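Both trims amount to a two-point linear correction derived from readings at the lower and upper range values. A sketch of that math, under the assumption of purely linear error (the meter readings are hypothetical):

```python
# Illustrative two-point trim: map raw readings onto the ideal line through
# the two calibration points. The same arithmetic applies to a sensor trim
# (A/D side, engineering units) and a 4-20 mA output trim (D/A side, mA).

def make_trim(read_low, read_high, ideal_low, ideal_high):
    """Return a correction function from two (read, ideal) pairs."""
    slope = (ideal_high - ideal_low) / (read_high - read_low)
    return lambda raw: ideal_low + slope * (raw - read_low)

# Output trim example: the meter read 4.06 mA and 20.12 mA at the 4/20 points.
trim = make_trim(4.06, 20.12, 4.0, 20.0)
print(round(trim(4.06), 2))   # 4.0
print(round(trim(20.12), 2))  # 20.0
```

In a real digital transmitter these corrections are applied via the device's own trim commands rather than in external code; the sketch only shows the underlying arithmetic.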
After calibration, the errors are graphed once again.
As with the “as found” values, there is some degree of hysteresis.
However, the maximum deviation has been reduced
from 0.38% to 0.18%, well within the tolerance of 0.20%.
In this video, you learned the importance of sensor calibration
of a measurement signal.
Calibration is an adjustment or set of adjustments
performed on a sensor or instrument
to make that instrument function as accurately, or error free, as possible.
Proper sensor calibration will yield accurate measurements,
which in turn, makes good control of the process possible.
When good control is realized,
then the process has the best chance of running efficiently and safely.
Want to learn PLC programming in an easy to understand format
and take your career to the next level?
Head on over to realpars.com