Calibration vs. Verification
In a metal detector audit, quality control personnel need to understand the difference between calibration and verification. Metal detectors cannot be calibrated in the usual sense: there is no standard to which a metal detector can be set. That might sound odd, because we use metal detectors to find a specified size of metal, whether ferrous, non-ferrous or stainless steel. If the right settings are chosen, we expect the detector to locate any given size of metal. In other words, one group of settings should detect 3.0mm ferrous; change the settings and you should be able to detect 1.5mm ferrous. We expect that altering the settings on a detector will change the size of metal it is capable of detecting.
But it’s not like calibrating a scale. With a scale, you take a standard weight, perhaps 1 lb, and adjust the scale so that it reads 1 lb. You might send a scale or another device, say calipers, to a third-party vendor, who would ensure that the device is calibrated, usually to an internationally accepted standard, and measures accurately and consistently.
A metal detector, however, has no national or international standard to which it can be “calibrated.” That’s because a wide variety of variables affect a detector’s capabilities. Most important among those variables is the product itself. But before we tackle that problem, let’s take a brief look at several other factors, including:

- the aperture size (sensitivity generally falls as the aperture grows)
- the speed at which product passes through the detector
- electrical noise and vibration from nearby equipment
- environmental conditions such as temperature and humidity
While all of these are important considerations, they are only some of the factors involved in setting up a metal detector. Ultimately, the term calibration applies to the relationship between the metal detector and the product. Once the proper aperture size is available and each of these factors has been settled, a metal detector is “calibrated” with clean, non-contaminated product so that the product has no effect on the detector. In simple terms, run a clean product through your detector and it shouldn’t reject the product.

What is being eliminated from the testing process is the “product effect”: the magnetic and conductive properties of the product itself. As the product passes through the aperture, it affects the coils used in the detection process, so the detector must account for this effect and either eliminate or ignore it. During setup, the detector needs to “learn” the product effect. The detector discovers this (along with the other factors above) and can then be set to a baseline: a setting at which the clean product and its container (paper, cardboard or other non-magnetic packaging) move through the detector without triggering a detection alarm or the associated reject device.
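As a rough illustration only, the “learn then set a baseline” step can be modeled in a few lines of code. This is not how any particular detector’s firmware works; the function names, signal values and the 25% margin are all hypothetical, chosen simply to show the idea of setting a reject threshold above the product effect measured from clean passes.

```python
# Illustrative model only: the detector records the signal produced by
# clean (non-contaminated) product and sets its reject threshold above
# that baseline. All names and numbers here are hypothetical.

def learn_product_effect(clean_pass_signals, margin=1.25):
    """Return a reject threshold set above the strongest signal
    observed while clean product passes through the aperture."""
    baseline = max(clean_pass_signals)   # worst-case product effect seen
    return baseline * margin             # headroom so clean product never rejects

def should_reject(signal, threshold):
    """A pass whose signal exceeds the threshold triggers the reject device."""
    return signal > threshold

# Ten passes of clean product produce signals of varying strength.
clean_signals = [0.42, 0.47, 0.39, 0.51, 0.44, 0.48, 0.40, 0.46, 0.50, 0.43]
threshold = learn_product_effect(clean_signals)

print(should_reject(0.49, threshold))   # clean-product pass → False
print(should_reject(0.95, threshold))   # pass with a metal contaminant → True
```

The point of the margin is exactly the baseline idea in the text: clean product, with its own magnetic and conductive signature, must never set off the reject device, while a genuine contaminant pushes the signal clearly above that signature.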
When it comes to an audit, many people will ask an auditor to come in and “calibrate” their detectors. An auditor can assist with that, as explained, but that’s not what’s done during an audit. An auditor’s job is to “verify” that the metal detector can achieve the specifications (usually those of a HACCP plan) that the quality control department needs it to achieve.
In a typical test, certified test pieces of the specified sizes and metal types (ferrous, non-ferrous and stainless steel) are passed through the aperture, usually with or inside the product. Each test piece is run at several positions, commonly the leading edge, the middle and the trailing edge of the product, and the auditor confirms that the detector both signals a detection and activates the reject device on every pass.
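A verification like this can be sketched as a simple pass/fail record. This is only an illustration: the spec sizes match the example in this article, but the three-pass convention and all function names are assumptions drawn from common industry practice, not from any formal standard.

```python
# Illustrative sketch only: a verification record for one metal detector,
# checking each specified test piece against the pass results.

SPECS = {"ferrous": 1.5, "non-ferrous": 2.0, "stainless steel": 3.0}  # mm

def verify(spec_results):
    """spec_results maps metal type -> list of booleans, one per test pass
    (True means the detector found and rejected the test piece).
    Verification succeeds only if every pass of every type succeeded."""
    failures = [metal for metal, passes in spec_results.items() if not all(passes)]
    return (len(failures) == 0, failures)

# Three passes per metal type, e.g. leading edge, middle, trailing edge.
results = {
    "ferrous":         [True, True, True],
    "non-ferrous":     [True, True, True],
    "stainless steel": [True, True, False],   # missed on the trailing edge
}
ok, failed = verify(results)
print(ok)       # False
print(failed)   # ['stainless steel']
```

The all-or-nothing rule reflects the point of verification: a single missed pass means the detector has not been shown to meet the specification for that metal type, and the cause (settings, product effect, test-piece placement) must be resolved before the standard can be verified.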
So if the company (the customer) has already established that it needs to achieve 1.5mm ferrous, 2.0mm non-ferrous and 3.0mm stainless steel, the auditor, using a procedure like the one outlined above, will “verify” that standard. If the product changes, the achievable standards could change, so the verification applies only to the products tested on that metal detection system. A change in the product requires that the detector be re-calibrated for that product before the auditor can verify that it meets the intended standards. Understanding the terminology means clarifying what you do on your production line and how it’s done. And in the final analysis, that makes for a safer product in the marketplace.