Instrumentation & Measurement Muhajir Ab. Rahim School of Mechatronic Engineering Universiti Malaysia Perlis
Introduction We make measurements every day (measuring our weight, body temperature, ...). The concept of measurement has been deeply rooted in human culture since the origin of civilization. In ancient times, how did people measure time? Length? Weight?
Why do we need Instrumentation & Measurement? To determine various parameters/information of a system or a process. To control the system variables (temperature, pressure, flow rate, ...) based on the collected measurement data. Currently, automatic control systems are widely used in process industries (oil refineries, chemical plants, ...) and in modern sophisticated systems (missile guidance, radar tracking systems, ...).
What is Instrumentation? Instrumentation is a field of study and work centering on the measurement and control of physical processes. These physical processes include pressure, temperature, flow rate, chemical consistency, etc. An instrument is a device that measures and/or acts to control any kind of physical process. Because the electrical quantities of voltage and current are easy to measure, manipulate, and transmit over long distances, they are widely used to represent such physical variables and to transmit the information to remote locations.
What is Measurement? Measurement is the act of assigning a specific value to a physical variable. The measurand is the variable whose value is measured. Metrology is the field of knowledge concerned with measurement. All measurements have 1) Magnitude 2) Uncertainty 3) Units
Method of Measurement All measurements involve a comparison between a measured quantity and a reference standard. There are two fundamental methods of measurement: 1) Deflection mode 2) Null mode
Deflection Mode A deflection mode is a measuring method that uses signal energy to cause a deflection in the output indicator. For example, the unknown weight of an object can easily be obtained from the deflection it causes in the spring of a spring balance. (Figure: spring balance with a scale graduated 10, 20, 30)
Null Mode Null mode is a measuring method that balances a known amount against an unknown amount to determine the measured value. For example, in an equal-arm beam balance, an unknown mass placed in one pan causes the beam and pointer to deflect. Masses of known value are placed on the other pan until a balanced (null) condition is indicated by the pointer. (Figure: equal-arm beam balance with standard mass, unknown mass, and pointer at the null position)
Classification of Instruments Analog and Digital Types Self-Generating (Passive) and Power-Operated (Active) Types Contacting and Non-Contacting Types
Exercise Give examples of analog and digital type instruments. Give examples of passive and active type instruments. Give examples of contacting and non-contacting type instruments. Answers: Analog/Digital - multimeter (both versions exist). Passive - thermometer, Bourdon tube pressure gauge. Active - accelerometer, ultrasound sensor. Contacting - thermometer, Bourdon tube pressure gauge. Non-Contacting - pyrometer, variable reluctance tachometer.
Generalized Measuring System Most measuring systems fall within the framework of a general arrangement consisting of three stages: Stage 1: A detection-transduction (sensor-transducer) stage. Stage 2: An intermediate (signal conditioning) stage. Stage 3: A terminating (readout-recording) stage.
Stages of the General Measurement System
Stage 1: Sensor-Transducer - senses the desired input to the exclusion of all others and provides an analogous output. Mechanical examples: contacting spindle, spring-mass, elastic devices (e.g., Bourdon tube for pressure, proving ring for force). Electrical examples: contacts, resistance, capacitance, inductance, thermocouple.
Stage 2: Signal Conditioning - modifies the transduced signal into a form usable by the final stage; usually increases amplitude and/or power, depending on the requirement. Mechanical examples: gearing, cranks, connecting links, cams, etc. Electrical examples: amplifying or attenuating, bridges, filters.
Stage 3: Readout-Recording - provides an indication or recording in a form that can be evaluated by a human operator. Indicators (displacement type): moving pointer and scale, liquid column. Indicators (digital type): LCD display, digital printing, magnetic recording (hard disk).
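The three stages above can be sketched as a chain of functions. This is a minimal illustration only: the transducer constant, amplifier gain, and display scaling are assumed values, not taken from any real instrument.

```python
# Minimal sketch of the three-stage measurement chain
# (hypothetical temperature-measurement example; all constants assumed).

def sensor_transducer(temperature_c):
    """Stage 1: sense the measurand and output an analogous signal.
    Assume a linear transducer producing 10 mV per degree Celsius."""
    return temperature_c * 0.010  # volts

def signal_conditioning(voltage):
    """Stage 2: modify the weak transduced signal into a usable form.
    Here, simple amplification with an assumed gain of 100."""
    return voltage * 100.0

def readout(voltage):
    """Stage 3: present the result in a form the operator can evaluate.
    After conditioning, 1 V corresponds to 1 degree Celsius."""
    temperature_c = voltage / 1.0
    return f"{temperature_c:.1f} degC"

reading = readout(signal_conditioning(sensor_transducer(25.0)))
print(reading)  # 25.0 degC
```

Each stage only needs to know the signal form handed to it by the previous stage, which is why real instruments can mix mechanical and electrical elements across the three stages.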
Standard & Calibration A standard is the known value used as the basis of calibration. Calibration is the act of applying a known input to a system and observing the system output.
Quality Parameters of an Instrumentation System (1/3) Resolution: the smallest increment in the measured value that can be detected. Accuracy: a measure of the difference between the measured value and the actual value, generally expressed as a percentage of the actual value. Precision: the ability of an instrument to reproduce a certain set of readings within a given deviation. Repeatability: the ability to reproduce the output signal exactly when the same measured quantity is applied repeatedly under the same environmental conditions. Range: the limits between which inputs can vary. Span: the maximum value minus the minimum value of the input. Drift: the change in the reading of an instrument for a fixed input variable over time.
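The span and accuracy definitions above amount to simple arithmetic. A small sketch, using a hypothetical pressure gauge with assumed numbers:

```python
# Span and accuracy for a hypothetical pressure gauge
# (all numbers assumed for illustration).
input_min, input_max = 0.0, 400.0   # gauge range limits, kPa
span = input_max - input_min        # span = maximum - minimum

actual = 200.0      # true applied pressure, kPa
measured = 196.0    # instrument reading, kPa
error_pct = abs(measured - actual) / actual * 100  # accuracy as % of actual value

print(span)       # 400.0
print(error_pct)  # 2.0
```

So this gauge spans 400 kPa and, for this input, reads within 2% of the actual value.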
Quality Parameters of an Instrumentation System (2/3) Sensitivity: the ability of the measuring instrument to respond to changes in the measured quantity; it is the ratio of the change in output to the change in input. Sensitivity, S = ∆O/∆I, where I is the input quantity to be sensed and O is the output signal which can be recorded.
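The ratio S = ∆O/∆I can be computed directly from two calibration points. The thermocouple readings below are assumed values for illustration:

```python
# Sensitivity S = dO/dI for a hypothetical thermocouple
# (input in degC, output in mV; readings are assumed values).
I1, I2 = 20.0, 30.0   # two input temperatures, degC
O1, O2 = 0.80, 1.21   # corresponding output voltages, mV

S = (O2 - O1) / (I2 - I1)   # change in output per unit change in input
print(round(S, 3))          # 0.041 (mV per degC)
```

A larger S means the instrument produces a bigger output swing for the same input change, i.e. it is more sensitive.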
Quality Parameters of an Instrumentation System (3/3) Hysteresis: the difference in readings obtained when an instrument approaches the same signal value from opposite directions.
Exercise Describe the precision and accuracy of each of these: 1) Low Accuracy, High Precision 2) High Accuracy, Low Precision 3) High Accuracy, High Precision
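The accuracy/precision distinction in the exercise above can also be judged numerically from repeated readings: accuracy is how close the mean is to the true value, precision is how tightly the readings cluster. The reference value and readings below are assumed data:

```python
import statistics

# Hypothetical repeated readings of a 50.0-unit reference.
# Accuracy: closeness of the mean to the true value.
# Precision: spread (standard deviation) of the readings.
true_value = 50.0
readings = [50.1, 49.9, 50.0, 50.2, 49.8]   # assumed data

mean = statistics.mean(readings)
spread = statistics.stdev(readings)

print(round(abs(mean - true_value), 6))  # small -> high accuracy
print(round(spread, 2))                  # small -> high precision
```

A tight cluster far from the true value would give a small spread but a large mean error: high precision, low accuracy, like the first case in the exercise.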