Glossary of Terms



Absolute Pressure – pressure measured relative to zero pressure (a perfect vacuum).

Absolute Pressure Sensor – a sensor that measures input pressure relative to a zero-pressure reference (a total vacuum on one side of the diaphragm).

Accuracy – a comparison of the actual output signal of a device to the true value of the input pressure. The various errors (such as linearity, hysteresis, repeatability, and temperature shift) contributing to the accuracy of a device are usually expressed as a % of span.

Altimetric Pressure Transducer – a barometric pressure transducer used to determine altitude from the pressure-altitude profile.

Analog Output – an electrical output from a transducer that changes proportionately with any change in input pressure.

Auto Zeroing Technique – a method used to automatically set the null point on a gage pressure transducer; usually done by using a microprocessor to open a solenoid valve at a predetermined time interval. This references atmospheric pressure to both sides of the gage pressure transducer. The microprocessor reads the output voltage and records it as the new null point. This method is used to eliminate errors due to null offset and null temperature shift.
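The sequence above can be sketched in Python; the `sensor` and `valve` objects and their methods are assumptions for illustration, not a real device API:

```python
def auto_zero(sensor, valve):
    """Re-establish the null point of a gage pressure transducer.

    `sensor` and `valve` are hypothetical interfaces: `valve.open()`
    vents atmosphere to both sides of the diaphragm, and
    `sensor.read_mv()` returns the raw output in millivolts.
    """
    valve.open()                 # reference atmosphere to both ports
    null_mv = sensor.read_mv()   # output at zero differential pressure
    valve.close()
    return null_mv               # store this as the new null point


def corrected_reading(raw_mv, null_mv):
    # Subtracting the stored null removes errors due to null offset
    # and null temperature shift from subsequent readings.
    return raw_mv - null_mv
```

In practice the controller would also wait for the pressure on both ports to settle before reading the null value.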


Barometric Pressure Transducer – an absolute pressure sensor that measures the local ambient atmospheric pressure.

Bi-directional Differential Pressure Transducer – a differential pressure transducer allowing the greater input pressure to be applied to either port.

Burst Pressure – the specified pressure that will rupture the sensing element or transducer case, causing leakage.


Calibration – an act of modifying transducer output to improve output accuracy, or to verify actual output vs. specification.

Calibration Curve – a graphical representation of the calibration record.

Chip – a die (un-packaged semiconductor device) cut from a silicon wafer, incorporating semiconductor circuit elements such as resistors, diodes, transistors, and/or capacitors.
Compensation – added circuitry or materials designed to counteract known sources of error.


Differential Pressure – the difference in pressure between two independent pressure sources.

Differential Pressure Transducer – a transducer that is designed to accept two independent and simultaneous pressure sources. The output is proportional to the pressure difference between the two sources.

Drift – an undesired change in output over a period of time that is not a function of any input pressure change.


End Points – pressure transducer outputs at specified upper and lower limits of the transducer range.

End Point Linearity – see Terminal Based Linearity (TBL).

Error – the algebraic difference between the indicated value and the true value of the input pressure. Usually expressed in percent of full span output, sometimes expressed in percent of the transducer output reading.
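As an illustrative calculation (all numbers hypothetical), error expressed in percent of full span can be computed as:

```python
def error_percent_fso(indicated, true_value, full_span):
    """Algebraic error as a percentage of full-span output."""
    return (indicated - true_value) / full_span * 100.0

# e.g., an indicated 50.2 psi at a true 50.0 psi on a 100 psi
# full span is a +0.2% FSO error.
```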

Error Band – the band of maximum deviations of the output values from a specified reference line or curve due to those causes attributable to the transducer. Usually expressed as ± percent of span output. The error band should be specified as applicable over at least two calibration cycles so as to include repeatability and verified accordingly.

Excitation – the external electrical voltage and/or current applied to a transducer for its proper operation (often referred to as the supply current or voltage).


Flow – the motion of a fluid. A fluid can be a liquid, a gas, or a movable solid (e.g., granular materials or slurries).

Flow Rate – the time rate of motion of a fluid expressed as a fluid quantity per unit time.

Flow Velocity – the time rate of motion of a fluid; expressed as a distance of fluid travel per unit time.

Frequency, Natural – the frequency of free (not forced) oscillations of the sensing element of a fully assembled transducer.

Frequency Output – an output in the form of frequency that varies as a function of the applied pressure.

Full-Scale Output (FSO) – the output at full-scale pressure at a specified supply voltage. The signal is the sum of the offset signal plus the full-scale span.

Full-Scale Span – the change in output over the operating pressure range at a specified supply voltage. The SPAN of a device is the output voltage variation between zero differential pressure and any given pressure. FULL-SCALE SPAN is the output variation between zero differential pressure and the maximum recommended operating pressure.
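The relationship between span and full-scale output can be sketched as follows; the 0.5 V / 4.5 V figures in the test values are assumed, illustrative numbers:

```python
def full_scale_span(output_at_zero, output_at_max):
    """Output change from zero differential pressure to the
    maximum recommended operating pressure."""
    return output_at_max - output_at_zero


def full_scale_output(offset, span):
    """FSO is the sum of the offset signal and the full-scale span."""
    return offset + span
```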


Gage Pressure – a form of differential pressure measurement in which atmospheric pressure is used as a reference.


Head Pressure – the height of a liquid column at the base of which a given pressure would be developed.
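The column-height/pressure relationship (p = ρ·g·h) can be sketched as below; the water density and standard-gravity constants are assumptions for illustration:

```python
RHO_WATER = 1000.0   # kg/m^3, assumed fluid density (water)
G = 9.80665          # m/s^2, standard gravity

def head_pressure_pa(height_m, density=RHO_WATER):
    """Pressure developed at the base of a liquid column: p = rho*g*h."""
    return density * G * height_m

def head_from_pressure_m(pressure_pa, density=RHO_WATER):
    """Column height at the base of which the given pressure develops."""
    return pressure_pa / (density * G)
```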

Hysteresis – the difference in output reading at a pressure point when the pressure point is approached first with an increasing pressure (from zero) and then with the decreasing pressure from full scale pressure.
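A worked example of the hysteresis figure at a single pressure point (readings are hypothetical):

```python
def hysteresis_percent_fso(up_reading, down_reading, full_span):
    """Hysteresis at one pressure point: the difference between the
    output approached with increasing pressure (from zero) and with
    decreasing pressure (from full scale), as a % of full span."""
    return abs(down_reading - up_reading) / full_span * 100.0

# e.g., 2.500 V rising vs. 2.504 V falling on a 4 V span
# is 0.1% FSO of hysteresis.
```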


Impact Pressure – the pressure in a moving fluid exerted parallel to the direction of the flow due to the flow velocity.

Input Impedance (resistance) – the impedance (resistance) measured between the positive and the negative (ground) input terminals at a specified frequency with the output terminals open.


Laser Trimming (automated) – a method for adjusting the value of thick film resistors using a computer-controlled laser system.

Leakage Rate – the maximum rate at which a fluid is permitted or determined to leak through a seal. The type of fluid, the differential pressure across the seal, the direction of leakage, and the location of the seal must be specified.

Line Pressure – the maximum pressure that can safely be applied to a transducer. Line pressure is also the maximum reference pressure allowable on a differential pressure transducer.

Linearity (end-point) – see Terminal Based Linearity (TBL).

Linearity (linearity error) – the deviation of the transducer output curve from a specified straight line. Linearity error is usually expressed as a % of span output.

Load Impedance – the impedance presented to the output terminals of a transducer by the associated external circuitry.

Long Term Drift – see Stability.


Mass Flow Rate – flow rate expressed as fluid mass per unit time.

Maximum Excitation – the maximum value of supply voltage or current that can be applied to the transducer at room conditions without causing damage or performance degradation beyond specified tolerances.


Non-Linearity – see Linearity.

Null – the condition when the pressure on each side of the sensing diaphragm is equal.

Null Offset – the electrical output present when the pressure transducer is at null.

Null Output – see Zero Pressure Output.

Null Temperature Shift – the change in null output value due to a change in temperature.


Offset – see Zero Pressure Offset.

Operating Pressure Range – the range of pressure between minimum and maximum pressures at which the output will meet the specified operating characteristics.

Operating Temperature Range – the range of temperature between minimum and maximum temperature at which the output will meet the specified operating characteristics.

Output Impedance – the impedance across the output terminals of a transducer presented by the transducer to the associated external circuitry.

Output Noise – the RMS or peak-to-peak (as specified) AC component of a transducer's DC output in the absence of a change in input pressure.

Overpressure – see Proof Pressure.


Partial Pressure – the pressure exerted by one constituent of a mixture of gases.

Piezoresistance – a resistive element that changes resistance relative to the applied stress it experiences (e.g., a strain gauge).

Pressure – a force per unit area.

Pressure Error – the maximum difference between the true pressure and the pressure inferred from the output, for any pressure in the operating pressure range.

Pressure Hysteresis – the difference in output at any given pressure in the operating pressure range when this pressure is approached from the minimum operating pressure and when approached from the maximum operating pressure at room temperature.

Pressure Range – the minimum and maximum pressures over which the transducer is calibrated or specified.

Pressure Sensor – a device that converts an input pressure into an electrical output.

Proof Pressure – the maximum specified pressure that may be applied to the sensing element of a transducer without causing a permanent change in the output characteristics.


Quiescent Supply Current – the supply current being drawn when the pressure transducer is at null.


Range – see Operating Pressure Range.

Ratiometric – ratiometricity refers to the ability of the transducer to maintain a constant ratio of output to supply voltage, at a constant pressure, over a range of supply voltage values.

Ratiometric (ratiometricity error) – at a given supply voltage, transducer output is a proportion of that supply voltage. Ratiometricity error is the change in this proportion resulting from any change to the supply voltage. Usually expressed as a % of full-scale output (FSO).
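A worked example of the ideal ratiometric relationship and the resulting error figure (all voltages illustrative):

```python
def ratiometric_output(fso_nominal, v_nominal, v_supply):
    """For an ideally ratiometric transducer, output scales in
    direct proportion to the supply voltage."""
    return fso_nominal * v_supply / v_nominal

def ratiometricity_error_pct(measured_out, fso_nominal, v_nominal, v_supply):
    """Deviation of the measured output from the ideal ratiometric
    output, expressed as a % of that ideal output."""
    ideal = ratiometric_output(fso_nominal, v_nominal, v_supply)
    return (measured_out - ideal) / ideal * 100.0
```

For example, a device with 4 V FSO at a 5 V supply should ideally produce 8 V FSO at a 10 V supply; measuring 8.08 V instead indicates a 1% ratiometricity error.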

Reference Pressure – an independent pressure, ranging from zero to the maximum pressure capability of a transducer, used as a relative standard against which the measured pressure is compared.

Repeatability – the difference in output reading at a given pressure point when the pressure is applied consecutively from the same direction.

Resolution – the magnitude of output step changes as the pressure is continuously varied over the range. This term primarily applies to potentiometric sensors. Resolution for other pressure transducers is generally limited by sensitivity of the readout device. Usually expressed in % of full-scale output (FSO).

Response Time – the length of time required for the output of a transducer to rise to a specified % of its final output value as a result of a step change of input pressure.

Room Conditions – ambient environmental conditions under which a transducer must commonly operate; established as follows:
(a) Temperature: 25 ±10°C (77 ±18°F). (b) Relative Humidity: 90% or less. (c) Barometric Pressure: 26 to 32 in. Hg. Note: Tolerances closer than shown may frequently be specified for transducer calibration and test environments.


Sealed Pressure Transducer (PSIS) – measures pressure with respect to an internal reference chamber sealed at atmospheric pressure. Gage pressure measurements below 100 PSI may require corrections for changes in atmospheric pressure and thermally induced 'reference' pressure errors.

Self Heating – internal heating of a transducer as a result of power dissipation.

Sensing Element – that part of a transducer that responds directly to changes in input pressure.

Sensitivity – the change in output per unit change in pressure for a specified supply voltage or current.

Sensitivity Shift – a change in sensitivity resulting from an environmental change such as temperature.

Sensor – a popular term sometimes used to describe a transducer. Technically, a sensor is an uncompensated, low-level output device that converts the measurand into an electronically detectable change (i.e., resistance, capacitance, or inductance).

Signal Conditioning – processing the form or mode of a signal so as to make it intelligible to, or compatible with, a given device, including such manipulations as pulse shaping, pulse clipping, compensating, digitizing, and linearizing.

Span – the algebraic difference between the upper and lower limits of the pressure range.

Stability – the ability of a transducer to retain its performance characteristics for a relatively long period of time. Unless otherwise stated, stability is the ability of a transducer to reproduce output readings obtained during its original calibration, at room conditions, for a specified period of time. It is typically expressed as within % of full-scale output (FSO) for a period of "X" months.

Static Calibration – a calibration recording output vs. pressure at fixed points at room temperature.

Static Error Band – the error band applicable at room temperature.

Static Pressure – the pressure of a fluid exerted normal to the surface along which a fluid flows. A fluid can be liquid or gaseous.

Storage Temperature Range – the range of temperature between minimum and maximum that can be applied without causing the sensor to fail to meet the specified operating characteristics.

Strain Gauge – a sensing device providing a change in electrical resistance proportional to the level of applied stress.

Supply Voltage (current) – the voltage (current) applied to the positive and negative (ground) input terminals.


Temperature Coefficient of Full-Scale Span – the percent of change in full scale span per unit change in temperature relative to the full-scale span at a specified temperature.

Temperature Coefficient of Resistance – the percent change in the DC input impedance, per unit change in temperature relative to the DC input impedance at a specified temperature.

Temperature Error – the maximum change in output, at an input pressure, within the specified range resulting from a change in temperature.

Terminal Based Linearity (TBL) – also called end-point linearity; a method of defining linearity. The maximum deviation of any data point on a transducer output curve from a straight line drawn between the end data points of that curve. TBL is approximately twice the magnitude of Best-Fit Straight Line (BFSL) linearity.
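The end-point calculation can be sketched as follows (the calibration data in the example is hypothetical):

```python
def tbl_percent_fso(pressures, outputs):
    """Terminal-based (end-point) linearity: the maximum deviation of
    any calibration point from the straight line through the two end
    points, as a % of the end-point span."""
    p0, p1 = pressures[0], pressures[-1]
    o0, o1 = outputs[0], outputs[-1]
    slope = (o1 - o0) / (p1 - p0)          # end-point line slope
    span = o1 - o0                         # end-point output span
    worst = max(abs(o - (o0 + slope * (p - p0)))
                for p, o in zip(pressures, outputs))
    return worst / span * 100.0

# e.g., points (0, 0.0 V), (50, 2.1 V), (100, 4.0 V): the end-point
# line predicts 2.0 V at 50, so the worst deviation is 0.1 V,
# or 2.5% of the 4 V span.
```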

Thermal Offset Shift – see Temperature Coefficient of Offset.

Thermal Span Shift – see Temperature Coefficient of Full Scale Span.

Thermal Zero Shift – see Temperature Coefficient of Offset.

Thick Film – a technology using screened-on pastes to form conductor, resistor, thermistor, and insulator patterns; the pastes are screened onto a substrate (usually ceramic) and cured by firing at elevated temperatures.

Thin Film – a technology using vacuum deposition of the conductors and dielectric materials onto a substrate (frequently silicon) to form an electrical circuit.

Total Flow – the flow rate integrated over a time interval.

Total Pressure – (also called stagnation pressure or ram pressure) the sum of the static pressure and the impact pressure.
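As a small worked example (densities and velocities illustrative): for incompressible flow the impact (dynamic) pressure is ½·ρ·v², and total pressure is the static pressure plus that term:

```python
def impact_pressure(density, velocity):
    """Dynamic (impact) pressure q = 1/2 * rho * v^2,
    assuming incompressible flow."""
    return 0.5 * density * velocity ** 2

def total_pressure(static_pa, impact_pa):
    """Total (stagnation) pressure: static plus impact pressure."""
    return static_pa + impact_pa

# e.g., air (rho ~ 1.225 kg/m^3) at 10 m/s gives about 61.25 Pa of
# impact pressure on top of the static pressure.
```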

Transducer – a device or medium that converts energy from one form to another. The term is generally applied to devices that take a physical phenomenon (pressure, humidity, temperature, flow, etc.) and convert it to an electrical signal.

Transmitter – a device specifically designed to enhance communication of information from one location to another. A pressure transmitter is a device with a specified 4-20 mA output.


Unidirectional Differential Pressure Sensor – a differential pressure transducer requiring the greater input pressure to be applied to a specific pressure port.


Vacuum – a pressure less than atmospheric pressure (a perfect vacuum is the absence of gaseous fluid).

Vibration Error – the maximum change in output of a transducer when a specific amplitude and range of frequencies are applied to a specific axis at room temperature with no pressure applied.

Vibration Error Band – the error recorded in output of a transducer when subjected to a given set of amplitudes and frequencies.

Volumetric Flow Rate – flow rate expressed as a fluid volume per unit time.


Zero – see Null; also used for the electrical output of zero volts at null conditions.

Zero Offset – see Null Offset.

Zero Pressure Offset – the output at zero pressure (absolute or differential, depending on the device type) for a specified supply voltage or current.