Industrial process data validation and reconciliation, or more briefly, data validation and reconciliation (DVR), is a technology that uses process information and mathematical methods in order to automatically correct measurements in industrial processes. The use of DVR allows for extracting accurate and reliable information about the state of industrial processes from raw measurement data and produces a single consistent set of data representing the most likely process operation.
Models, data and measurement errors
Industrial processes, for example chemical or thermodynamic processes in chemical plants, refineries, oil or gas production sites, or power plants, are often represented by two fundamental means:
- Models that express the general structure of the processes,
- Data that reflects the state of the processes at a given point in time.
Models can have different levels of detail; for example, they can incorporate simple mass or compound conservation balances, or more advanced thermodynamic models including energy conservation laws. Mathematically the model can be expressed by a nonlinear system of equations $F(y) = 0$ in the vector of process variables $y$, which incorporates all of the system constraints mentioned above (for example the mass balances around each unit).
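To make this concrete, the following minimal sketch expresses such a model as a residual function whose root corresponds to a consistent process state. The two-unit flow network and its numbers are illustrative assumptions, not taken from any particular plant:

```python
import numpy as np

def model_residuals(y):
    """Residuals F(y) of a toy mass-balance model: streams a and b feed
    the first unit producing c, and a second unit converts c into d
    without loss. y = [a, b, c, d]; F(y) = 0 holds when all balances
    are satisfied."""
    a, b, c, d = y
    return np.array([
        a + b - c,  # mass balance around the first unit
        c - d,      # mass balance around the second unit
    ])

# A consistent operating point satisfies F(y) = 0 exactly.
print(model_residuals(np.array([4.0, 6.0, 10.0, 10.0])))  # -> [0. 0.]
```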
Error types
Data originates typically from measurements taken at different places throughout the industrial site, for example temperature, pressure, volumetric flow rate measurements etc. To understand the basic principles of DVR, it is important to first recognize that plant measurements are never 100% correct, i.e. raw measurements contain two types of errors:
- random errors due to intrinsic sensor accuracy and
- systematic errors (or gross errors) due to sensor calibration or faulty data transmission.
Random errors mean that the measurement is a normally distributed random variable whose mean equals the (unknown) true value of the process variable. A systematic error, in contrast, shifts the mean of the measurement away from the true value, so that it cannot be removed by averaging repeated measurements.
Other sources of errors when calculating plant balances include process faults such as leaks, unmodeled heat losses, incorrect physical properties or other physical parameters used in equations, and incorrect structure such as unmodeled bypass lines. Other errors include unmodeled plant dynamics such as holdup changes, and other instabilities in plant operations that violate steady state (algebraic) models. Additional dynamic errors arise when measurements and samples are not taken at the same time, especially lab analyses.
The normal practice of using time averages for the data input partly reduces the dynamic problems. However, that does not completely resolve timing inconsistencies for infrequently-sampled data like lab analyses.
This use of average values, like a moving average, acts as a low-pass filter, so high frequency noise is mostly eliminated. The result is that, in practice, data reconciliation is mainly making adjustments to correct systematic errors like biases.
Necessity of removing measurement errors
ISA-95 is the international standard for the integration of enterprise and control systems. It asserts that:
Data reconciliation is a serious issue for enterprise-control integration. The data have to be valid to be useful for the enterprise system. The data must often be determined from physical measurements that have associated error factors. This must usually be converted into exact values for the enterprise system. This conversion may require manual, or intelligent reconciliation of the converted values [...]. Systems must be set up to ensure that accurate data are sent to production and from production. Inadvertent operator or clerical errors may result in too much production, too little production, the wrong production, incorrect inventory, or missing inventory.
History
DVR has become increasingly important because industrial processes are becoming more and more complex. DVR started in the early 1960s with applications aiming at closing material balances in production processes where raw measurements were available for all variables. At the same time the problem of gross error identification and elimination was presented. In the late 1960s and 1970s unmeasured variables were taken into account in the data reconciliation process, and DVR became more mature through the consideration of general nonlinear equation systems coming from thermodynamic models. Quasi steady state dynamics for filtering and simultaneous parameter estimation over time were introduced in 1977 by Stanley and Mah. Dynamic DVR was formulated as a nonlinear optimization problem by Liebman et al. in 1992.
Data reconciliation
Data reconciliation is a technique that aims at correcting measurement errors that are due to measurement noise, i.e. random errors. From a statistical point of view the main assumption is that no systematic errors exist in the set of measurements, since they would bias the reconciliation results and reduce the robustness of the reconciliation.
Given $n$ measurements $y_i$ with standard deviations $\sigma_i$, data reconciliation can mathematically be expressed as an optimization problem of the form

$$\min_{y^*,\,x} \sum_{i=1}^{n} \left( \frac{y_i^* - y_i}{\sigma_i} \right)^2 \quad \text{subject to} \quad F(x, y^*) = 0,$$

where $y_i^*$ is the reconciled value of the $i$-th measurement, $x$ denotes the unmeasured variables, and $F(x, y^*) = 0$ collects the process constraints (balances, bounds, etc.).

The term $\left( \frac{y_i^* - y_i}{\sigma_i} \right)^2$ is called the penalty of measurement $i$.
In other words, one wants to minimize the overall correction (measured in the least squares term) that is needed in order to satisfy the system constraints. Additionally, each least squares term is weighted by the inverse of the standard deviation of the corresponding measurement, so that less accurate measurements are allowed larger corrections.
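As a minimal illustration of this optimization problem, the following sketch solves a small reconciliation instance with SciPy's general-purpose SLSQP solver. The four-stream network (a + b = c, c = d), the measured values and the standard deviations are made-up numbers, and industrial tools typically use dedicated solvers rather than this generic setup:

```python
import numpy as np
from scipy.optimize import minimize

# Measured flows (with noise) and their standard deviations; illustrative numbers.
y_meas = np.array([4.1, 6.2, 9.9, 10.4])   # streams a, b, c, d
sigma  = np.array([0.1, 0.1, 0.3, 0.3])

def objective(y):
    # Sum of squared, sigma-weighted corrections (the "penalty" terms).
    return np.sum(((y - y_meas) / sigma) ** 2)

constraints = [
    {"type": "eq", "fun": lambda y: y[0] + y[1] - y[2]},  # a + b = c
    {"type": "eq", "fun": lambda y: y[2] - y[3]},         # c = d
]

res = minimize(objective, y_meas, method="SLSQP",
               constraints=constraints,
               bounds=[(0, None)] * 4)   # flows must be non-negative
print("reconciled flows:", res.x)
print("objective value :", res.fun)
```

The reconciled flows satisfy both balances exactly, and the objective value is the sum of the penalty terms used later for gross error detection.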
Redundancy
Data reconciliation relies strongly on the concept of redundancy to correct the measurements as little as possible in order to satisfy the process constraints. Here, redundancy is defined differently from redundancy in information theory. Instead, redundancy arises from combining sensor data with the model (algebraic constraints), sometimes more specifically called "spatial redundancy", "analytical redundancy", or "topological redundancy".
Redundancy can be due to sensor redundancy, where sensors are duplicated in order to have more than one measurement of the same quantity. Redundancy also arises when a single variable can be estimated in several independent ways from separate sets of measurements at a given time or time averaging period, using the algebraic constraints.
Redundancy is linked to the concept of observability. A variable (or system) is observable if the models and sensor measurements can be used to uniquely determine its value (system state). A sensor is redundant if its removal causes no loss of observability. Rigorous definitions of observability, calculability, and redundancy, along with criteria for determining them, were established by Stanley and Mah for cases with set constraints such as algebraic equations and inequalities. Next, we illustrate some special cases:
Topological redundancy is intimately linked with the degrees of freedom ($dof$) of a mathematical system, i.e. the minimum number of pieces of information (measurements) that are required in order to calculate all of the system variables.

When speaking about topological redundancy we have to distinguish between measured and unmeasured variables. In the following let us denote by $x$ the unmeasured variables and by $y$ the measured variables. The system of process constraints then becomes $F(x, y) = 0$, which is a nonlinear system in $x$ and $y$. If the system is calculable with the measurements given, the level of topological redundancy is defined as the difference between the number of equations and the number of unmeasured variables, or equivalently as the number of measurements minus the degrees of freedom, $red = n - dof$.
Simple counts of variables, equations, and measurements are inadequate for many systems, breaking down for several reasons: (a) Portions of a system might have redundancy, while others do not, and some portions might not even be possible to calculate, and (b) Nonlinearities can lead to different conclusions at different operating points. As an example, consider the following system with 4 streams and 2 units.
Example of calculable and non-calculable systems
Two streams a and b feed the first unit, whose outlet c feeds the second unit, which produces stream d. We incorporate only flow conservation constraints and obtain $a + b = c$ for the first unit and $c = d$ for the second.

If we have measurements for $a$ and $c$, for example, the remaining flows can be calculated ($b = c - a$ and $d = c$), so the system is observable, although without redundancy. If instead only $c$ and $d$ are measured, the constraint $c = d$ makes these two measurements redundant, but $a$ and $b$ cannot be determined individually, since only their sum is fixed; the system is then not calculable.
In 1981, observability and redundancy criteria were proven for these sorts of flow networks involving only mass and energy balance constraints. After combining all the plant inputs and outputs into an "environment node", loss of observability corresponds to cycles of unmeasured streams. That is seen in the second case above, where streams a and b are in a cycle of unmeasured streams. Redundancy classification follows by testing for a path of unmeasured streams between the end nodes of a measured stream, since removing that measurement would create a cycle of unmeasured streams. Measurements c and d are redundant in the second case above, even though part of the system is unobservable.
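For linear balance models like this one, calculability and redundancy can also be checked numerically rather than graph-theoretically. The sketch below is a simplified rank test on the constraint matrix, not the graph-based criteria of the cited work, and the network is the two-unit example above:

```python
import numpy as np

# Linear flow-balance constraints A @ f = 0 for the 2-unit example,
# streams ordered [a, b, c, d]:  a + b - c = 0,  c - d = 0.
A = np.array([[1.0, 1.0, -1.0,  0.0],
              [0.0, 0.0,  1.0, -1.0]])

def classify(measured):
    """Check calculability of the unmeasured streams and the degree of
    redundancy for a linear balance model (simplified rank test)."""
    unmeasured = [i for i in range(A.shape[1]) if i not in measured]
    A_x = A[:, unmeasured]                           # columns of unmeasured streams
    calculable = np.linalg.matrix_rank(A_x) == len(unmeasured)
    redundancy = np.linalg.matrix_rank(A) - np.linalg.matrix_rank(A_x)
    return calculable, redundancy

print(classify(measured=[0, 2]))  # a, c measured -> calculable, redundancy 0
print(classify(measured=[2, 3]))  # c, d measured -> not calculable, redundancy 1
```

The first call reproduces the calculable case, the second the case where c and d are redundant while a and b remain unobservable.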
Benefits
Redundancy can be used as a source of information to cross-check and correct the measurements, and to calculate estimates of the unmeasured variables from the measured ones via the process model.
Data validation
Data validation denotes all validation and verification actions before and after the reconciliation step.
Data filtering
Data filtering denotes the process of treating measured data such that the values become meaningful and lie within the range of expected values. Data filtering is necessary before the reconciliation process in order to increase robustness of the reconciliation step. There are several ways of data filtering, for example taking the average of several measured values over a well-defined time period.
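A minimal sketch of such a pre-reconciliation filter might combine a range check with a moving average; the window length and bounds are arbitrary placeholders:

```python
import numpy as np

def filter_measurements(raw, low, high, window=12):
    """Minimal pre-reconciliation filter: discard values outside the
    physically expected range, then damp high-frequency noise with a
    moving average over the given window."""
    raw = np.asarray(raw, dtype=float)
    in_range = raw[(raw >= low) & (raw <= high)]        # simple range check
    kernel = np.ones(window) / window
    return np.convolve(in_range, kernel, mode="valid")  # low-pass smoothing
```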
Result validation
Result validation is the set of validation or verification actions taken after the reconciliation process and it takes into account measured and unmeasured variables as well as reconciled values. Result validation covers, but is not limited to, penalty analysis for determining the reliability of the reconciliation, or bound checks to ensure that the reconciled values lie in a certain range, e.g. the temperature has to be within some reasonable bounds.
Gross error detection
Result validation may include statistical tests to validate the reliability of the reconciled values, by checking whether gross errors exist in the set of measured values. These tests can be, for example, a global chi-square test on the value of the objective function or individual tests on each measurement.
If no gross errors exist in the set of measured values, then each penalty term in the objective function is a random variable that is normally distributed with mean equal to 0 and variance equal to 1. By consequence, the objective function is a random variable which follows a chi-square distribution, since it is the sum of squares of normally distributed random variables, with degrees of freedom equal to the level of redundancy. Comparing the value of the objective function with the critical value of that chi-square distribution at a chosen confidence level (for instance 95%) indicates whether a gross error exists: if the objective value exceeds the critical value, at least one gross error is likely present in the set of measurements.
The individual test compares each penalty term in the objective function with the critical values of the normal distribution. If the $i$-th penalty term exceeds the critical value at the chosen confidence level, the corresponding measurement is suspected of containing a gross error.
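A sketch of both tests, following the simplification used above that each penalty term is standard-normal in the absence of gross errors (refined versions use the variance of the adjustments instead of the raw measurement variance), could look like this:

```python
import numpy as np
from scipy.stats import chi2, norm

def global_test(objective_value, redundancy, alpha=0.05):
    """Global (chi-square) test: reject 'no gross error' if the reconciled
    objective exceeds the chi-square critical value at the given level."""
    return objective_value > chi2.ppf(1.0 - alpha, df=redundancy)

def individual_tests(y_reconciled, y_measured, sigma, alpha=0.05):
    """Individual (measurement) test: flag measurements whose standardized
    adjustment exceeds the two-sided normal critical value."""
    z = (y_reconciled - y_measured) / sigma
    return np.abs(z) > norm.ppf(1.0 - alpha / 2.0)
```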
Advanced data validation and reconciliation
Advanced data validation and reconciliation (DVR) is an integrated approach combining data reconciliation and data validation techniques, which is characterized by
- complex models that incorporate, besides mass balances, also thermodynamic relations such as energy balances,
- techniques for gross error remediation that ensure the meaningfulness of the reconciled values, and
- an integrated workflow combining these steps (see below).
Thermodynamic models
Simple models include mass balances only. When adding thermodynamic constraints such as energy balances to the model, its scope and the level of redundancy increase. Indeed, as noted above, the level of redundancy is defined as the difference between the number of equations and the number of unmeasured variables; adding energy balances adds equations to the model and, provided that enough additional measurements (e.g. temperatures) are available, raises the level of redundancy.
Gross error remediation
Gross errors are systematic measurement errors that may bias the reconciliation results. Therefore it is important to identify and eliminate these gross errors from the reconciliation process. After the reconciliation, statistical tests can be applied that indicate whether or not a gross error exists somewhere in the set of measurements. These techniques of gross error remediation are based on two concepts:
Gross error elimination determines one measurement that is biased by a systematic error and discards this measurement from the data set. The determination of the measurement to be discarded is based on different kinds of penalty terms that express how much the measured values deviate from the reconciled values. Once the gross errors are detected they are discarded from the measurements and the reconciliation can be done without these faulty measurements that spoil the reconciliation process. If needed, the elimination is repeated until no gross error exists in the set of measurements.
Gross error relaxation aims at relaxing the estimate of the uncertainty of suspicious measurements so that the reconciled value lies within the 95% confidence interval. Relaxation is typically applied when it is not possible to determine which measurement around one unit is responsible for the gross error (equivalence of gross errors); the measurement uncertainties of the measurements involved are then increased.
It is important to note that the remediation of gross errors reduces the quality of the reconciliation: either the redundancy decreases (elimination) or the uncertainty of the measured data increases (relaxation). Therefore it can only be applied when the initial level of redundancy is high enough to ensure that the data reconciliation can still be done (see the section on redundancy above).
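The elimination strategy can be sketched as a simple loop. Here reconcile() is a hypothetical helper standing in for the reconciliation step sketched earlier (returning reconciled values and the objective), and global_test() is the chi-square check from the gross error detection sketch:

```python
import numpy as np

def eliminate_gross_errors(y_meas, sigma, redundancy, min_redundancy=1):
    """Serial gross error elimination (sketch): drop the measurement with
    the largest penalty, one at a time, until the global test passes or
    redundancy would be lost. Indices of kept measurements are returned."""
    active = list(range(len(y_meas)))
    while redundancy >= min_redundancy:
        y_rec, obj = reconcile(y_meas[active], sigma[active])
        if not global_test(obj, redundancy):
            return active, y_rec                  # no gross error detected
        penalties = ((y_rec - y_meas[active]) / sigma[active]) ** 2
        active.pop(int(np.argmax(penalties)))     # discard worst measurement
        redundancy -= 1                           # one redundant equation lost
    return active, None                           # redundancy exhausted
```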
Workflow
Advanced DVR solutions offer an integration of the techniques mentioned above:
- data acquisition from a data historian, database or manual inputs
- data validation and filtering of raw measurements
- data reconciliation of filtered measurements
- result verification
- range check
- gross error remediation (and go back to step 3)
- result storage (raw measurements together with reconciled values)
The result of an advanced DVR procedure is a coherent set of validated and reconciled process data.
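Tying the pieces together, a rough orchestration of these steps could look like the sketch below; it again uses the hypothetical helpers from the earlier sketches, so it is an outline rather than a working system:

```python
import numpy as np

def dvr_cycle(raw_histories, bounds, sigma, redundancy):
    """One advanced-DVR pass over a set of measurement tags.
    raw_histories: one raw time series per measured variable;
    bounds: (low, high) pair per variable for the range check."""
    # steps 1-2: validate/filter each raw history down to one averaged value
    y = np.array([filter_measurements(h, lo, hi)[-1]
                  for h, (lo, hi) in zip(raw_histories, bounds)])
    # step 3: reconcile the filtered measurement vector
    y_rec, obj = reconcile(y, sigma)
    # steps 4-6: verify; remediate gross errors and reconcile again if needed
    if global_test(obj, redundancy):
        _, y_rec = eliminate_gross_errors(y, sigma, redundancy)
    # step 7: return reconciled values for storage alongside the raw data
    return y_rec
```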
Applications
DVR finds application mainly in industry sectors where measurements are either inaccurate or not available at all, for example in the upstream sector where flow meters are difficult or expensive to position, or where accurate data is of high importance, for example for safety reasons in nuclear power plants. Another field of application is performance and process monitoring in oil refining or in the chemical industry.
As DVR makes it possible to calculate reliable estimates even for unmeasured variables, the German Engineering Society (VDI Gesellschaft Energie und Umwelt) has accepted DVR as a means to replace expensive sensors in the nuclear power industry (see VDI norm 2048).