Regular, properly performed calibration is crucial to meeting customers’ specifications for manufactured and machined parts. Here’s what you need to know.

For machine shops, sheet metal fabricators, 3D printing services and other manufacturers, consistent adherence to dimensional specifications is where the rubber meets the road.

And while high-quality metrology tools are essential to achieving that, they don’t guarantee that the measurements they provide are accurate.

Like CNC machine tools, even the most advanced coordinate measuring machine (CMM) or vision system is susceptible to performance degradation over time.

This makes regular, properly performed calibration a crucial part of any manufacturing process, even though some shops overlook or postpone the procedure until it’s too late.

Correct Calibration: No ‘Right or Wrong Way’

“What’s most important is to adhere to whatever guidelines were outlined in your shop’s quality manual, and that these meet any customer requirements,” says Patrick Sullivan, national account manager at Mitutoyo America Corp.

There isn’t necessarily a “right or wrong way” to calibrate inspection equipment, he explains. Instead, the correct procedures are determined by a host of variables, including the device manufacturer’s recommendations, environmental conditions, accuracy requirements, device stability, and criticality of the workpieces being measured.

The question then becomes: What guidelines and procedures are necessary to meet those requirements and make sure that every one of the shop’s gauge blocks, micrometers, vernier calipers, bore gauges and other instruments is not only accurate, but stays that way from one calibration procedure until the next?

What’s in a Meter?

To help customers get the best results, Jim Salsbury, Mitutoyo’s general manager of corporate metrology, hosts an online Metrology Training Lab on the company’s YouTube channel. Visitors to the site enjoy free, on-demand access to an extensive list of video lessons covering everything from outside micrometer calibration to the influence of temperature on measurements. And for those pursuing certification through the American Society for Quality (ASQ) or other qualifying bodies, practice tests are available.

Salsbury illustrates the potential for error-inducing variations by outlining how a standard unit of measurement—the meter—has evolved over time.

Once defined as “one ten-millionth of the distance from the equator to the North Pole” along a meridian passing through Paris, the meter was long represented by a platinum-iridium bar kept at the International Bureau of Weights and Measures in France.

In 1960, the meter was redefined as 1,650,763.73 wavelengths of the orange-red light emitted by krypton-86 atoms (each about 605.78 nanometers long) as they transitioned from one energy level to another.

And in 1983, it changed yet again when scientists agreed that a meter should be equal to the distance that light travels in a vacuum in 1/299,792,458 of a second.
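As a quick sanity check, the two definitions can be compared numerically. The constants below are the published values; the tiny residual comes from rounding the krypton-86 wavelength, which the 1960 definition fixed implicitly:

```python
# Cross-checking the 1960 and 1983 definitions of the meter.
# Both constants are published values, not shop measurements.

C = 299_792_458                    # speed of light in m/s (exact, by definition)
KR86_WAVELENGTH = 605.780_210e-9   # krypton-86 orange-red line, in meters

# 1983 definition: the distance light travels in 1/299,792,458 of a second
meter_1983 = C * (1 / C)           # exactly 1 m by construction

# 1960 definition: 1,650,763.73 krypton-86 wavelengths
meter_1960 = 1_650_763.73 * KR86_WAVELENGTH

print(meter_1983)   # 1.0
print(meter_1960)   # ~1.0, agreeing to better than a part per million
```

The point for the shop floor is that each redefinition tightened the reproducibility of the standard that every downstream gauge ultimately traces back to.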

Metrological Traceability

Why should that matter to workers who make parts for a living? Because of this: Calibration is based on dimensional standards (like the meter) that must be traceable to a known, widely accepted value or artifact.

For instance, calibrating your 1-inch micrometer usually means measuring a 1-inch gauge block and adjusting the micrometer’s graduated sleeve accordingly.

To meet metrological traceability, that gauge block must itself be traceable, through an unbroken chain of calibrations, to the standard meter just described (or another known artifact). In the United States, that chain runs through standards maintained by the National Institute of Standards and Technology (NIST).

The same is true for every measuring device on the production floor or inspection lab. All must ultimately be traceable to a known and agreed-on industry standard. In addition, they must be calibrated at regular intervals or when accuracy is in doubt (after accidentally kicking it across the floor, for example).
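Many shops automate that interval tracking. Here’s a minimal sketch of flagging gauges that are due for recalibration; the device names, dates and intervals are illustrative, not recommendations:

```python
from datetime import date, timedelta

# Illustrative gauge records: (device, last calibration date, interval in days).
gauges = [
    ("1-inch micrometer", date(2024, 1, 15), 365),
    ("dial calipers",     date(2023, 6, 1),  180),
    ("bore gauge",        date(2024, 3, 10), 90),
]

def calibration_due(last_cal: date, interval_days: int, today: date) -> bool:
    """Return True if the device is at or past its calibration due date."""
    return today >= last_cal + timedelta(days=interval_days)

today = date(2024, 7, 1)
overdue = [name for name, last, days in gauges if calibration_due(last, days, today)]
print(overdue)  # ['dial calipers', 'bore gauge']
```

A real system would also shorten the interval, or trigger an immediate check, after an event like the accidental kick mentioned above.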

5 Key Steps in Calibration

There are five distinct steps or activities within the calibration process, Salsbury notes. These include:

  • Determination of Reference Values: This involves precise measurement and documentation of a standard device (the gauge block used to calibrate your 1-inch micrometer, for instance), which can then be used as a reference in future measurements. Again, it must be traceable to a qualified standard.
  • Conformity or Acceptance Testing: Here, the device is checked to make certain it operates within specified tolerance limits. The gauge block just described is measured several times with a micrometer or dial calipers and the results documented. If the device consistently reads within a predefined tolerance band, no further steps are necessary.
  • Adjustment or Correction: If the device is found to be out of tolerance during a calibration check, it will need to be adjusted or corrected to improve its accuracy—rotating the graduated sleeve, in our micrometer example. This adjustment step is often what people mean by the broader term calibration.
  • User Calibration: This type of calibration involves the routine checks and adjustments made by the user of the measuring equipment. Examples include zero-setting a dial indicator or bore gauge before use and adjusting for errors as needed.
  • Interim Check or Verification of Calibration: To reduce risk between scheduled calibrations, many shops implement quick interim checks or verifications to monitor the status of their measuring equipment. These are usually shortened versions of a complete calibration routine and are performed more frequently.
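The conformity-testing step above can be sketched in a few lines. The nominal size and tolerance band here are illustrative assumptions, not values from any standard:

```python
# Sketch of a conformity (acceptance) check: measure a reference gauge
# block several times and verify every reading falls within tolerance.

NOMINAL = 1.0000     # 1-inch reference gauge block, in inches
TOLERANCE = 0.0001   # ±0.0001 in acceptance band (assumed for illustration)

def within_tolerance(readings, nominal=NOMINAL, tol=TOLERANCE):
    """Return True if every reading lies inside nominal ± tol."""
    return all(abs(r - nominal) <= tol for r in readings)

# Five repeat measurements of the reference block with the micrometer
readings = [1.00002, 0.99998, 1.00005, 0.99996, 1.00001]
print(within_tolerance(readings))  # True: no adjustment step needed
```

If any reading fell outside the band, the shop would move on to the adjustment or correction step and then repeat the check.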

These are general guidelines, and the exact steps—especially the frequency involved in any calibration routine—can vary significantly depending on factors like the specific requirements of the equipment being calibrated, the system or process in which the equipment is being used, and the standards or regulations applicable to the industry or application in question.

Salsbury points to ISO/IEC Guide 99:2007, the International Vocabulary of Metrology, as a good place to start for anyone interested in learning more, along with ISO 10012:2003 (measurement management systems) and ISO/IEC 17025:2017 (competence requirements for testing and calibration laboratories).

Calibration Cost and Benefits

The benefits of a robust calibration procedure and schedule go well beyond making good parts.

For starters, part accuracy cannot be consistently achieved unless machine shops and sheet metal fabricators follow a documented calibration process.

A documented process is the only way to achieve the precision that manufacturing companies and their customers depend on, because it catches discrepancies in a device’s measurements and ensures the device delivers reliable data.

Compliance is also crucial to maintaining long-term relationships with customers. Many contract manufacturers must abide by stringent regulations and quality standards, such as AS9100 for aerospace, ISO 13485 for medical device manufacturers, and ISO 9001 for general quality management.

Regular calibration of measuring devices forms an essential part of these compliance requirements.

Consistency goes hand in hand with accuracy and is a primary goal of any calibration process. Uniform measurements lead to fewer production variances, resulting in less waste and a higher rate of customer acceptance.

In all stages of manufacturing, from design to quality control, calibration ensures that the data collected is consistent and dependable.

Cost considerations shouldn’t drive the calibration discussion, but the fact remains that the process takes considerable time and effort.

The bottom line? Both time and effort are well spent, since eliminating dimensional surprises on the shop floor spells greater profitability, less reworking and scrapping, and above all, better relations with customers.

Which parts of the calibration process do you find most challenging? Tell us in the comments below.
