Friday, 23 January 2015

Basics Of Instrumentation [Definitions]


This is my first post on this blog. As a student of this stream, I've learnt that no question is silly or meaningless; bearing this in mind, I'll try to cover every possible question dealing with instrumentation and its counterparts.
To start with, let us try to explore the basics first:

  • What is instrumentation: Instrumentation is the art of measuring the value of a plant parameter or process variable and supplying a signal that is proportional to the measured parameter.
  • What is an instrument: An instrument is a device used to measure, monitor, or display a process variable.
  • What are the process variables: The process variables are: flow, pressure, level, temperature and quality (e.g., %O2, %CO2, pH, etc.)
  • Define the above process variables with units:
  1. Flow: the movement of a gas or liquid in a steady, continuous current or stream (kg/hr, litres/min).
  2. Pressure: force acting per unit area, P = F/A (pascal, N/m², kgf/cm², lb/in², bar).
  3. Level: the difference between two heights, typically the height of material in a vessel relative to a reference point (metres, mm, cm, %).
  4. Temperature: the degree of hotness or coldness of a body (degrees Celsius, degrees Fahrenheit, kelvin, degrees Rankine).
  5. Quality: deals with analysis of composition (pH, %CO2, %O2, conductivity, viscosity).
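The pressure definition above (P = F/A) and the units listed with it can be tied together in a short sketch. The force and area values here are hypothetical; the conversion factors are the standard ones (1 bar = 10⁵ Pa, 1 psi ≈ 6894.757 Pa):

```python
# Sketch: pressure from force and area (P = F/A), with unit conversions.
force_n = 500.0   # force in newtons (example value)
area_m2 = 0.01    # area in square metres (example value)

pressure_pa = force_n / area_m2        # pascals, i.e. N/m^2
pressure_bar = pressure_pa / 1e5       # 1 bar = 100,000 Pa
pressure_psi = pressure_pa / 6894.757  # 1 psi ≈ 6894.757 Pa

print(f"{pressure_pa:.0f} Pa = {pressure_bar:.3f} bar = {pressure_psi:.2f} psi")
```

Running this for a 500 N force on 0.01 m² gives 50,000 Pa, or 0.5 bar.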

What is:


  • Accuracy: A number or quantity which defines the limit of error under reference conditions.
  • Precision: The closeness of agreement among repeated measurements of the same quantity made with the same instrument.
  • Attenuation: A decrease in signal magnitude between two points, or between two frequencies.
  • Dead time: The time interval between the initiation of an input change or stimulus and the start of the resulting response.
  • Dead zone: It is the largest input change to which the transducer fails to respond.
  • Drift: It is an undesired change in output over a period of time, which is unrelated to input, operating conditions, or load.
  • Error: The difference between the indication and the true value of the measured signal.
  • Random errors: Random errors are those which tend to average out over the time of the series of measurements.
  • Systematic errors: Systematic errors do not average out over the time of measurement and indeed may be constant.
  • Span error: It is the difference between the actual span and the specified span and is expressed as the percent of specified span.
  • Zero error: The error of a device operating under the specified conditions of use when the input is at the lower range value.
  • Static gain: The ratio of the output change to the input change after steady state has been reached.
  • Hysteresis: The maximum difference between the upscale and downscale indications of the measured signal during a full-range traverse for the same input.
  • Interference: Any spurious voltage or current arising from external sources and appearing in the circuits of a device.
  • Common mode interference: It is the form of interference which appears between the measuring circuit terminals and ground.
  • Normal mode interference: It is the form of interference which appears between measuring circuit terminals.
  • Linearity: The closeness to which a curve approximates a straight line.
  • Non-linearity: It is the maximum deviation from a straight line joining the outputs at the extreme range divided by the output range.
  • Range: The region between the limits within which a quantity is measured, received or transmitted, is expressed by stating the lower and upper range values.
  • Repeatability: The closeness of agreement among a number of consecutive measurements of the output for the same value of the measured signal under the same operating conditions.
  • Reproducibility: The closeness of agreement among repeated measurements of the output for the same value of the input under the same operating conditions.
  • Response: The general behaviour of the output of a device as a function of the input, considered with respect to time.
  • Signal-to-noise ratio: The ratio of signal amplitude to noise amplitude.
  • Time constant: The time required for the output to complete 63.2% of the total rise or decay.
  • Span: The algebraic difference between upper and lower range values.
  • Zero shift: Any parallel shift of the input-output curve.
  • Sensitivity: It is the rate of change of output of a system with respect to changes in input.
  • Resolution: The smallest change in input that an instrument can detect.
  • Transducer: It is a device which converts the energy taken from the measurand into an electrical signal.
  • Calibration: The process of determining the characteristics of a system by measuring its output for a variety of known input values.
  • Traceability: The unbroken chain of comparisons relating a measuring instrument to national or international standards. In practice, the national primary standard can be related to the international standard.
  • Uncertainty: It is the range of values about the final result within which the true value is believed to lie.
  • Stability: The ability of a measuring instrument to maintain its performance characteristics constant over time.
  • Zero-stability: It is the ability of the transducer or instrument to restore its output to zero when its input returns to zero.
  • Monotonicity: The property that, when a transducer is subjected to a continuously increasing input signal, its output signal neither decreases nor skips a value.
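Several of the dynamic terms defined above (time constant, response) can be illustrated with a first-order system, whose step response completes 63.2% (1 − e⁻¹) of its total change after one time constant. The sketch below is a minimal example; the function name and the values chosen are my own:

```python
import math

def first_order_response(t, tau, initial=0.0, final=1.0):
    """Output of a first-order system at time t after a step from initial to final.

    tau is the time constant: at t = tau the output has completed
    1 - e^-1, i.e. about 63.2%, of the total rise or decay.
    """
    return initial + (final - initial) * (1.0 - math.exp(-t / tau))

tau = 2.0  # example time constant, in seconds
for t in (0.0, tau, 3 * tau, 5 * tau):
    print(f"t = {t:4.1f} s -> {first_order_response(t, tau):.3f}")
```

After one time constant the output reaches about 0.632 of the step; after five time constants it is within 1% of the final value, which is why response times are often quoted as a multiple of tau.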
