In this paper, we propose that measurement and instrumentation are essential to maintaining the stability and integrity of any system. Every system has a healthy and acceptable range of parameter values. In scientific, medical, and engineering environments, these parameter values must remain consistent; validation and confirmation ensure that the scientific, medical, and engineering processes within the system stay within their respective ranges. Such confirmation establishes system and process stability. In most cases, the instruments are administered, read, processed, and monitored by human beings as part of managing a smaller segment of a more encompassing process.
This paper shows the need to develop international standards for describing the reliability of real-time measurement values coming from intelligent measuring instruments embedded in equipment. New technologies (microprocessors, digital communications, emerging physical principles for measurement, etc.) make possible the development of intelligent measurement instruments (IMIs) provided with automatic self-diagnostics and self-validation. In this paper, IMI characteristics are considered, and the need to develop guidelines is also discussed. Technical propositions are based upon experience with IMI development and national standards (BS 7986:2001 and BS 7986:2005, UK; MI 2021-1989). This is the first international contribution to this field, which is being carried out by a partnership of Russian and British scientists.
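The range-checking idea in the abstract above can be sketched in a few lines. This is a hypothetical illustration, not the paper's method: the parameter names and limits are invented for the example.

```python
# Illustrative only: flag process parameters that drift outside their
# acceptable ranges, as a human monitor of the instruments would.
ACCEPTABLE_RANGES = {                 # assumed limits, not from the paper
    "temperature_C": (20.0, 80.0),
    "pressure_kPa": (90.0, 110.0),
}

def out_of_range(readings):
    """Return the parameters whose current values violate their range."""
    violations = {}
    for name, value in readings.items():
        lo, hi = ACCEPTABLE_RANGES[name]
        if not (lo <= value <= hi):
            violations[name] = value
    return violations

print(out_of_range({"temperature_C": 85.0, "pressure_kPa": 101.3}))
# flags temperature_C; pressure is within range
```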
British Journal of Anaesthesia, 2012
This paper describes the design of a robust PID controller for propofol infusion in children and presents the results of clinical evaluation of this closed-loop system during endoscopic investigations in children aged 6-17 years. The controller design is based on a set of models that describes the inter-patient variability in the response to propofol infusion in the study population. The PID controller is tuned to achieve sufficient robustness margins for the identified uncertainty. 108 children were enrolled in the study; anesthesia was closed-loop controlled in 102 of these cases. Clinical evaluation of the system shows that closed-loop control of both induction and maintenance of anesthesia in children, based on the WAVCNS index as a measure of clinical effect, is feasible. A robustly tuned PID controller can accommodate the inter-patient variability in children, and spontaneous breathing can be maintained in most subjects.
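For readers unfamiliar with the control law the abstract refers to, here is a minimal discrete PID controller driving a generic first-order plant. The gains, time constant, and setpoint are arbitrary placeholders; the paper's actual robust tuning for the propofol pharmacological models is far more involved.

```python
class PID:
    """Minimal discrete PID controller (illustrative gains, not clinical ones)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        if self.prev_error is None:
            derivative = 0.0            # avoid a derivative kick on the first step
        else:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive an assumed first-order plant (time constant 1 s) toward a setpoint.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.1)
y = 0.0
for _ in range(500):
    u = pid.update(50.0, y)
    y += 0.1 * (-y + u)                 # forward-Euler step of dy/dt = (-y + u)/tau
```

The integral term is what removes the steady-state error; robustness to inter-patient variability, as in the paper, comes from choosing gains that keep adequate margins across the whole model set rather than tuning to one nominal patient.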
IEEE Engineering in Medicine and Biology Magazine, 1988
Artificial Intelligence in Medicine, 2002
The validation of a software product is a fundamental part of its development, and focuses on an analysis of whether the software correctly resolves the problems it was designed to tackle. Traditional approaches to validation are based on a comparison of results with what is called a gold standard. Nevertheless, in certain domains, it is not always easy or even possible to establish such a standard. This is the case of intelligent systems that endeavour to simulate or emulate a model of expert behaviour. This article describes the validation of the intelligent system computer-aided foetal evaluator (CAFE), developed for intelligent monitoring of the antenatal condition based on data from the non-stress test (NST), and how this validation was accomplished through a methodology designed to resolve the problem of the validation of intelligent systems. System performance was compared to that of three obstetricians using 3450 min of cardiotocographic (CTG) records corresponding to 53 different patients. From these records different parameters were extracted and interpreted, and thus, the validation was carried out on a parameter-by-parameter basis using measurement techniques such as percentage agreement, the Kappa statistic or cluster analysis. Results showed that the system's agreement with the experts is, in general, similar to agreement between the experts themselves which, in turn, permits our system to be considered at least as skilful as our experts. Throughout our article, the results obtained are commented on with a view to demonstrating how the utilisation of different measures of the level of agreement existing between system and experts can assist not only in assessing the aptness of a system, but also in highlighting its weaknesses. This kind of assessment means that the system can be fine-tuned repeatedly to the point where the expected results are obtained.
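The Kappa statistic mentioned in the abstract corrects raw percentage agreement for the agreement two raters would reach by chance. A minimal sketch of Cohen's kappa for two raters over the same items (the labels here are invented, not from the CAFE study):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal labels to the same items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where p_expected
    is the chance agreement implied by each rater's label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    p_expected = sum(
        (rater_a.count(lab) / n) * (rater_b.count(lab) / n) for lab in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Two raters agreeing on 3 of 4 hypothetical classifications:
print(cohens_kappa(["ok", "ok", "risk", "risk"],
                   ["ok", "ok", "risk", "ok"]))
```

A kappa near 1 indicates agreement well beyond chance; values near 0 mean the raw agreement is no better than guessing from the label frequencies, which is why the paper reports kappa alongside simple percentage agreement.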
2008
This paper discusses the various methods by which instruments used in industry and process control can be computer-based. Instruments can be connected to computers so that collecting, controlling, and adjusting parameters under the supervision of a computer program is facilitated. This paper discusses various methods of data acquisition by which properties exhibited by various instruments can be acquired in an appropriate form, processed, and made ready for the computer environment. It also examines the means of transmitting acquired signals to the appropriate interface system, which eventually links to the computer system for processing and control. It concludes by advising instrument and process control engineers to understand the main principles of communication so as to ensure proper integration into the process control system and industrial environment.
2010
In this paper, dependability modelling and performance evaluation are studied for structures in which the instruments constituting Safety Instrumented Systems (SIS) are intelligent, in order to determine the contribution of intelligent instruments to safety applications. A dynamic approach using Stochastic Petri Nets (SPN) is proposed, and the metrics used to evaluate the dependability of Intelligent Distributed Safety Instrumented Systems (IDSIS) refer to the two failure modes identified by the safety standards: dangerous failure and safe failure.
A REVIEW STUDY ON INSTRUMENTATION AND CONTROL ENGINEERING, 2023
In the present review, instrumentation and control engineering (ICE) is defined as a branch of engineering that studies the measurement and control of process variables, and the design and implementation of systems that incorporate them. ICE combines two branches of engineering. Instrumentation engineering is the science of the measurement and control of process variables within a production or manufacturing area. Control engineering, also called control systems engineering, is the engineering discipline that applies control theory to design systems with desired behaviors. Control engineers are responsible for the research, design, and development of control devices and systems, typically in manufacturing facilities and process plants. Control methods employ sensors to measure the output variable of the device and provide feedback to the controller so that it can make corrections toward desired performance. Automatic control manages a device without the need for human input, such as cruise control regulating a car's speed. In the present study, a comprehensive review of instrumentation and control engineering has been presented. The subject is considered from several viewpoints: a general introduction to instrumentation and control engineering; a comprehensive treatment of instrumentation, covering its introduction, historical background and development, applications, measurement parameters, instrumentation engineering, and the impact of modern developments; control engineering, covering its introduction, overview, history, control theory, control systems, control engineering education, careers, and recent advancements; and, finally, the conclusions.
IEEE Instrumentation & Measurement Magazine, 2021
2004
This paper provides an overview of recent developments in self-validating sensors. This concept assumes the availability of internal computing power for self-diagnostics, and of digital communications to convey measurement and diagnostic data. A generic set of metrics is proposed for describing measurement quality, including online uncertainty. A SEVA instrument based on the Coriolis mass flow meter is described; its ability to detect and compensate for the effects of two-phase flow has been implemented in a commercial meter. SEVA has been incorporated into a British Standard, which is currently being extended. Other related standardisation efforts include work by the European user organisations WIB and NAMUR, who are collaborating on an initiative to develop a common framework for describing sensor diagnostics online. Comparison with the SEVA Coriolis meter shows some of the limitations of the WIB approach. Recent theoretical developments in SEVA include a simple technique for combining the outputs of redundant SEVA sensors for consistency checking and the calculation of a combined best estimate.
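One standard way to combine redundant measurements that each carry an online uncertainty, as SEVA sensors do, is inverse-variance weighting. This sketch illustrates that general statistical technique; it is not the specific SEVA combination procedure from the paper or the British Standard.

```python
def combine_measurements(values, uncertainties):
    """Inverse-variance weighted combination of redundant measurements.

    Each measurement is weighted by 1/sigma^2, so more certain readings
    dominate. Returns (best_estimate, combined_uncertainty).
    """
    weights = [1.0 / u ** 2 for u in uncertainties]
    total = sum(weights)
    best = sum(w * v for w, v in zip(weights, values)) / total
    return best, (1.0 / total) ** 0.5

# Two redundant flow readings; the tighter one pulls the estimate toward it:
best, sigma = combine_measurements([10.0, 10.4], [0.1, 0.2])
print(best, sigma)
```

A consistency check in the same spirit would compare each sensor's deviation from the combined estimate against its own reported uncertainty, flagging a sensor whose residual is implausibly large.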
2002
The development of Instrument Surveillance and Calibration Verification (ISCV) systems for complex processes requires an empirical model to provide estimations of the process measurements. The residual differences between the estimations and measurements are then evaluated to determine the proper operation of the process sensors. This work presents the results of applying two different empirical modeling strategies to a set of 55 process sensors from a nuclear power plant. The application of the Neural Network Partial Least Squares (NNPLS) algorithm for ISCV systems has been developed at the University of Tennessee, and the Process Evaluation and Analysis by Neural Operators (PEANO) system, utilizing autoassociative artificial neural networks (AANN), has been developed at the Institutt for Energiteknikk in Halden, Norway. The case study presented illustrates the performance of both systems on historical data which lacks the high correlations normally present in highly redundant signal sets. While redundant information typically causes numerical instabilities, due to a rank-deficient predictor variable matrix, the NNPLS model first performs an orthogonal transformation to combat these instabilities. Moreover, the NNPLS results show a direct relationship between the maximum signal correlations and the prediction accuracy of the empirical model. With a direct neural network approach, other methods are employed during training to stabilize the solution. Three months of data were available, from which the models were developed and evaluated. The average error of the NNPLS ISCV system predictions was 1.11%, calculated as the average absolute difference with respect to the mean value of the measured signals. Direct comparisons between the NNPLS and PEANO systems are presented for 5 sensors, including an example of a faulted steam flow sensor.
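The residual-based check and the error metric described above can be sketched directly. The metric follows the abstract's definition (average absolute difference relative to the mean of the measured signal); the drift threshold is an invented placeholder, not a value from the paper.

```python
def average_percent_error(predictions, measurements):
    """Average absolute prediction error as a percentage of the signal mean,
    matching the accuracy metric described for the NNPLS ISCV system."""
    n = len(measurements)
    mean = sum(measurements) / n
    total_abs_err = sum(abs(p - m) for p, m in zip(predictions, measurements))
    return 100.0 * total_abs_err / (n * abs(mean))

def sensor_suspect(predictions, measurements, threshold_pct=5.0):
    """Flag a sensor whose residuals exceed an assumed percentage threshold,
    as an ISCV system would before calling for recalibration."""
    return average_percent_error(predictions, measurements) > threshold_pct

# Model tracks the sensor closely: small residual, sensor not flagged.
print(average_percent_error([100.0, 101.0], [100.0, 100.0]))
```

In an actual ISCV deployment the residual test is statistical (accounting for model uncertainty and noise bands) rather than a fixed percentage, but the structure is the same: predict, subtract, and judge the residual.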
IFAC Proceedings Volumes, 2010
Eastern-European Journal of Enterprise Technologies, 2022
Proceedings of 2nd IFAC Symposium on Intelligent Components and Instruments for Control Applications, SICICA, 1994
IEEE Transactions on Instrumentation and Measurement, 1994
IEEE Aerospace and Electronic Systems Magazine, 2000
IEEE Instrumentation & Measurement Magazine, 1998
Journal of the Association for Laboratory Automation, 2007
Vision Electronica Algo Mas Que Un Estado Solido, 2014
2001 Conference Proceedings of the 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2001
Journal of Aerospace Engineering, 2014