Inspection Training

Class Information
Tooling U-SME inspection training classes are offered at the beginner, intermediate, and advanced levels. The typical class consists of 12 to 25 lessons and takes approximately one hour to complete.
Class Name: Linear Instrument Characteristics 115
Description: This class describes the various characteristics of linear measuring instruments and explains how variation affects the inspection process. Includes an Interactive Lab.
Number of Lessons: 15
Language: English, Spanish

Class Outline
  • Objectives
  • The Importance of Measuring Length
  • What Is Variation?
  • Accuracy
  • Error In Measurement
  • Precision
  • Calibration
  • Stability
  • Linearity
  • Resolution
  • Amplification
  • Hysteresis
  • Factors Affecting Measurement
  • Cost of Measurement
  • Summary
Class Objectives
  • Identify the most common measured quantity.
  • Define variation.
  • Define accuracy.
  • Describe how errors affect the measuring process.
  • Define precision.
  • Describe how calibration eliminates errors.
  • Define stability.
  • Define linearity.
  • Define resolution.
  • Define amplification.
  • Define hysteresis.
  • Identify the factors that affect measurement.
  • Describe how inspection affects costs.

Class Vocabulary

accuracy: The difference between a measurement reading and the true value of that measurement.
amplification: The relationship between the movement of a measuring instrument's contact points and the resulting movement of the needle or scale readout.
bias: The predicted difference, on average, between the measurement and the true value. Bias is also known as accuracy.
calibration: The comparison of a device of unknown accuracy to a device with a known, accurate standard to eliminate any variation in the device being checked.
caliper: A measuring instrument with two pairs of jaws on one end and a long beam containing a marked scale of unit divisions. One pair of jaws measures external features; the other pair measures internal features.
coordinate measuring machine: A sophisticated measuring instrument with a flat polished table and a suspended probe that measures parts in three-dimensional space.
correction factor: The amount of deviation in a measurement that is accounted for in the calibration process. You can either add the correction factor to the measured value or adjust the measuring instrument.
depth gage: A type of measuring instrument that measures the depth of holes, slots, or recesses.
dial indicator: A measuring instrument with a contact point attached to a spindle and gears that move a pointer on the dial. Dial indicators are available with graduations for reading different measurement values.
discrimination: The distance between two lines on a scale, or the fineness of an instrument's divisions of measurement units.
drift: The actual change in the measurement value when the same characteristic is measured under the same conditions, by the same operator, at different points in time. Drift indicates how often an instrument needs recalibration.
error: The amount of deviation from a standard or specification. Errors should be eliminated from the measuring process.
error of measurement: The actual difference between a measurement value and the known standard value.
gage: A device that determines whether or not a part feature is within specified limits. Most gages do not provide an actual measurement value; however, measuring instruments are also sometimes called gages.
granite: A dense, wear-resistant material that is capable of excellent flatness. Granite is often used for inspection surfaces.
graph: A diagram that represents the variation of one variable compared to another.
height gage: A type of measuring instrument with a precision finished base, a beam that is at a right angle to the base, and an indicator.
hysteresis: The delay between the action and reaction of a measuring instrument. Hysteresis is the amount of error that results from this delay.
linearity: The amount of error change throughout an instrument's measurement range. Linearity is also the amount of deviation from an instrument's ideal straight-line performance.
measuring instrument: A device used to inspect, measure, test, or examine parts in order to determine compliance with required specifications.
micrometer: A U-shaped measuring instrument with a threaded spindle that slowly advances toward a small anvil. Micrometers are available in numerous types for measuring assorted dimensions and features.
plug gage: A hardened, cylindrical gage used to inspect the size of a hole. Plug gages are available in standardized diameters.
precision: The degree to which an instrument will repeat the same measurement over a period of time.
repeatability: The ability to obtain consistent results when measuring the same part with the same measuring instrument.
resolution: The smallest change in a measured value that the instrument can detect. Resolution is also known as sensitivity.
rule of ten: The inspection guideline stating that a measuring instrument must be ten times more precise than the acceptable tolerance of the inspected part feature.
slope: The slant of a line that appears when comparing two variables on a graph.
specified range of measurement: The limits of measurement values that an instrument is capable of reading. The dimension being measured must fit within this range.
stability: The ability of a measuring instrument to retain its calibration over a long period of time. Stability determines an instrument's consistency over time.
standard: A recognized true value. Calibration must compare measurement values to a known standard.
systematic error: An error that is not determined by chance but is introduced by an inaccuracy in the system. Systematic errors are predictable and expected.
thermal characteristic: The way a material behaves due to changes in heat. Measuring instruments have thermally stable characteristics so that they are not affected by temperature changes.
tolerance: The unwanted but acceptable deviation from a desired dimension.
variation: A difference between two or more similar things.
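Two of the vocabulary terms above, correction factor and rule of ten, involve simple arithmetic. The sketch below illustrates both; the instruments and numeric values are hypothetical examples chosen for illustration, not taken from the class.

```python
# Illustrative sketch of two vocabulary terms: a correction factor is
# added to a raw reading to account for deviation found in calibration,
# and the rule of ten says an instrument should be ten times more
# precise than the tolerance of the feature it inspects.

def apply_correction(measured_value: float, correction_factor: float) -> float:
    """Add the calibration correction factor to a raw reading."""
    return measured_value + correction_factor

def satisfies_rule_of_ten(instrument_resolution: float, tolerance: float) -> bool:
    """True if the instrument resolves at least ten divisions of the tolerance."""
    return instrument_resolution <= tolerance / 10

# Hypothetical: calibration shows a caliper reads 0.02 mm low,
# so the correction factor is +0.02 mm.
corrected = round(apply_correction(25.38, 0.02), 2)
print(corrected)  # 25.4

# Hypothetical: a 0.001 mm micrometer checking a feature with a
# total tolerance of 0.02 mm meets the rule of ten (0.001 <= 0.002),
# while a 0.005 mm instrument does not.
print(satisfies_rule_of_ten(0.001, 0.02))  # True
print(satisfies_rule_of_ten(0.005, 0.02))  # False
```

Note that checking the rule of ten against the instrument's resolution is a common shortcut; a fuller evaluation would also consider the instrument's accuracy and repeatability.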