Inspection Training


Class Information
Tooling U-SME classes are offered at the beginner, intermediate, and advanced levels. The typical class consists of 12 to 25 lessons and will take approximately one hour to complete.
Class Name: Calibration Fundamentals 210
Description: This class describes the calibration process and explains how measuring instruments are traced back to national and international standards. Includes an Interactive Lab.
Prerequisites: 350110, 350115
Difficulty: Intermediate
Number of Lessons: 20
Language: English, Spanish
 

Class Outline
  • Objectives
  • The Importance of Calibration
  • Measurement Standards
  • Hierarchy of Measurement Standards
  • Traceability
  • ISO 9000 Requirements
  • Working Standards
  • Gage Blocks
  • Measurement Uncertainty
  • Uncertainty vs. Error
  • Random and Systematic Errors
  • The Calibration Process
  • Calibrating a Micrometer
  • Calibration Records
  • The Calibration Report
  • Calibration Cycles
  • Factors Affecting Calibration
  • In-House vs. Outside Calibration
  • How Calibration Affects Cost
  • Summary
  
Class Objectives
  • Describe the main purpose of calibration.
  • Explain why calibration requires measurement standards.
  • Identify the hierarchy of measurement standards.
  • Define traceability.
  • Describe the ISO 9000 calibration requirement.
  • Describe the role of working standards.
  • Describe how gage blocks are used in calibration.
  • Define measurement uncertainty.
  • Distinguish between uncertainty and error.
  • Distinguish between random and systematic errors.
  • List the steps in the calibration process.
  • Describe how to calibrate a micrometer.
  • Identify the purpose of a calibration record.
  • Identify the contents of a calibration report.
  • Explain the importance of regular calibration.
  • Identify the key factors that affect calibration.
  • Describe factors that determine in-house or outside calibration.
  • Describe how calibration affects cost.

Class Vocabulary

accuracy The closeness of agreement between a measurement reading and the true value of that measurement.
anvil The fixed, nonadjustable block on a micrometer. The face of the anvil is used as the reference from which the dimension is taken.
auditor An individual outside of an organization who objectively evaluates the effectiveness of a company's quality system.
calibration The comparison of a device of unknown accuracy to a known, accurate standard to detect and correct any variation in the device being checked.
calibration laboratory A controlled test environment where higher-level calibration is performed. These calibration results should be traced back to NIST.
calibration record A document displayed with a measuring instrument that contains information about its calibration. Calibration records help maintain accuracy and traceability.
calibration report A document that contains information about a particular calibration procedure. Calibration reports maintain traceability.
correction factor The amount of deviation in a measurement that is accounted for in the calibration process. The correction factor can be added to the measured value, or the measuring instrument can be adjusted (a worked sketch follows this vocabulary list).
dial indicator A measuring instrument with a contact point attached to a spindle and gears that move a pointer on the dial. Dial indicators are available with graduations for reading different measurement values.
drift The actual change in the measurement value when the same characteristic is measured under the same conditions, with the same operator, at different points in time. Drift indicates how often a measuring instrument needs recalibration.
error The amount of deviation from a standard or specification. Errors should be eliminated in the measuring process.
gage block A hardened steel block that is manufactured with highly accurate dimensions. Gage blocks are available in a set of standardized lengths.
hierarchy A group of items ranked from lowest to highest according to a given criterion. The hierarchy of measurement standards is ranked according to quality.
international standard A measurement standard recognized by international agreement and used as the basis for assigning values to other standards.
ISO 9000 A collection of documents that lists requirements for the creation and implementation of an effective quality system.
ISO 9001:2000 The section of the ISO 9000 standard containing the list of requirements. ISO 9001:2000 is the auditable section of the standard.
light wave A wave that transmits light energy through space. Light wave values are used to determine primary standards.
machine tool A power-driven piece of metalworking equipment for cutting or forming metal.
master thread gage A gage that is used to calibrate thread ring gages. The master thread gage inspects the internal threads of the ring gages.
measurement standard A recognized true value. Calibration must compare measurement values to a known standard.
metrologist A specialist in the science of measurement. Metrologists test the highest-quality standards.
micrometer A hand-held measuring device used to inspect the dimensions of parts. The typical micrometer is accurate within 0.001 in. or 0.02 mm.
National Institute of Standards and Technology The U.S. organization required by law to maintain national standards. This organization was formerly known as the National Bureau of Standards.
precision The ability of a process to repeat the same measurement over time. Precision describes consistency, not closeness to the true value.
primary standard A measurement standard with the highest quality. A light wave is used to determine a primary standard.
quality system The objectives and processes of a company designed to focus on quality and customer satisfaction.
random error An error that results from unpredictable variations in one or more influence quantities. Random errors are inconsistent and cannot be eliminated by a fixed correction.
secondary standard A measurement standard whose value is set by comparison with a primary standard. Secondary standards are also known as transfer standards.
spindle A rotating component on a micrometer that advances toward the anvil to make contact with the part.
systematic error An error that is not determined by chance but is introduced by an inaccuracy in the system. Systematic errors are predictable and expected (the simulation sketch after this list contrasts them with random errors).
time meter A device used with a measuring instrument that records the number of hours an instrument operates. Time meters help determine calibration cycles.
tolerance An unwanted but acceptable variation from a specified dimension.
traceability The ability to verify each higher step of calibration until the national standards maintained by NIST, and ultimately international standards, are reached.
transfer standard A measurement standard whose value is set by comparison with a primary standard. Transfer standards are also known as secondary standards.
true value A measurement value with no errors. The true value can never be known with total certainty.
uncertainty The measurement range in which the true value of a measurement is expected to lie. Uncertainty is an estimation of error (the worked sketch after this list shows how it is reported).
variation A difference between two or more similar things.
working gage block A type of gage block that is used to calibrate measuring instruments. These are generally either Grade 2 or 3 gage blocks.
working standard A measurement standard used to calibrate or check measuring instruments. Gage blocks are common working standards.
wringing Bringing two surfaces of microinch flatness together so that they adhere, leaving only microinch separation. Gage blocks are wrung together in various combinations to form any length (a block-selection sketch follows this list).
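
Correction Factor and Uncertainty Sketch
The correction factor and uncertainty entries above describe simple arithmetic: add the correction to the raw reading, then state the range in which the true value is expected to lie. The Python sketch below is a minimal illustration only; the readings, correction, and uncertainty values are hypothetical and not taken from the class material.

```python
# Minimal sketch: applying a correction factor and reporting uncertainty.
# All numeric values below are hypothetical illustration values.

def apply_correction(measured: float, correction: float) -> float:
    """Add the calibration correction factor to a raw reading."""
    return measured + correction

def report(measured: float, correction: float, uncertainty: float) -> str:
    """Report a corrected value with its uncertainty interval."""
    corrected = apply_correction(measured, correction)
    low, high = corrected - uncertainty, corrected + uncertainty
    return (f"corrected value: {corrected:.4f} in. "
            f"(true value expected between {low:.4f} and {high:.4f} in.)")

# Example: calibration showed this micrometer reads 0.0003 in. low,
# with an uncertainty of +/- 0.0001 in.
print(report(measured=0.5002, correction=0.0003, uncertainty=0.0001))
# -> corrected value: 0.5005 in. (true value expected between 0.5004 and 0.5006 in.)
```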
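Random vs. Systematic Error Sketch
Random and systematic errors behave differently over repeated readings: a systematic error shifts every reading by roughly the same amount and can be corrected, while random error scatters readings unpredictably and can only be characterized. The simulation below makes that distinction concrete; the true value, bias, and spread are assumed values chosen for illustration.

```python
import random
import statistics

# Minimal simulation sketch: contrast random and systematic error.
# The true value, bias, and spread below are hypothetical.
TRUE_VALUE = 1.0000       # in.
SYSTEMATIC_BIAS = 0.0005  # constant offset, e.g. from a worn anvil face
RANDOM_SPREAD = 0.0002    # standard deviation of unpredictable variation

random.seed(42)
readings = [TRUE_VALUE + SYSTEMATIC_BIAS + random.gauss(0, RANDOM_SPREAD)
            for _ in range(10)]

# The gap between the mean and the true value estimates the systematic
# error; calibration can remove it with a correction factor.
mean = statistics.mean(readings)
print(f"estimated systematic error: {mean - TRUE_VALUE:+.5f} in.")

# The scatter estimates random error; it cannot be corrected away,
# only reported as part of measurement uncertainty.
print(f"estimated random error (std. dev.): {statistics.stdev(readings):.5f} in.")
```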
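Gage Block Selection Sketch
Because wrung gage blocks stack to the sum of their lengths, building a target length is a combination-search problem over the block sizes in a set. The sketch below uses simple backtracking over a small, hypothetical subset of inch-based block sizes; a real set (such as an 81-piece set) contains many more blocks and typically yields stacks of fewer blocks.

```python
from typing import List, Optional

# Hypothetical subset of gage block sizes (inches). A real 81-piece
# set includes many more blocks in ten-thousandth increments.
BLOCKS = [0.1001, 0.1007, 0.107, 0.117, 0.100, 0.200, 0.500, 1.000]

def choose_blocks(target: float, blocks: List[float]) -> Optional[List[float]]:
    """Find distinct blocks that wring (sum) to the target length.

    Works in ten-thousandths of an inch as integers to avoid
    floating-point round-off while searching.
    """
    units = round(target * 10000)
    sizes = sorted((round(b * 10000) for b in blocks), reverse=True)

    def search(remaining: int, start: int) -> Optional[List[int]]:
        if remaining == 0:
            return []
        for i in range(start, len(sizes)):
            if sizes[i] <= remaining:
                rest = search(remaining - sizes[i], i + 1)
                if rest is not None:
                    return [sizes[i]] + rest
        return None

    chosen = search(units, 0)
    return None if chosen is None else [c / 10000 for c in chosen]

# Example: build a 1.3078 in. stack from the hypothetical set above.
print(choose_blocks(1.3078, BLOCKS))
# -> [1.0, 0.107, 0.1007, 0.1001]
```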