Email: info@calmetrics.in

Frequently Asked Questions (FAQ)

Provided below are answers to our most frequently asked questions (FAQs). If, after viewing this page, you require a more specific answer or your question is not covered, please feel free to email us for further assistance.

To answer this properly we must first define “traceability”. The definition of metrological traceability which has achieved global acceptance in the metrology community is contained in the International Vocabulary of Metrology (VIM): traceability is the “property of a measurement result whereby the result can be related to a reference through a documented unbroken chain of calibrations, each contributing to the measurement uncertainty”. It is important to note that traceability is a property of the result of a measurement, not of an instrument, calibration report, or laboratory. It is not achieved by following any one particular procedure or using special equipment. Merely having an instrument calibrated, even by NIST, is not enough to make the measurement results obtained from that instrument traceable to realizations of appropriate SI units or other stated references. The measurement system by which values are transferred must be clearly understood and under control. Accordingly, the phrase “traceable to NIST”, in its most proper sense, is shorthand for “results of measurements that are traceable to reference standards developed and maintained by NIST”. The phrase “NIST traceability” does not in any way imply acceptance by NIST or any other national metrology institute. It indicates only that the laboratory has used a procedure which provides measurement results that can be traced to a NIST reference.

Test report numbers issued by NIST are used solely for administrative purposes. Although they often uniquely identify documents that bear evidence of traceability, test report numbers themselves do not address the traceability requirements and should not be considered the sole evidence of traceability. NIST test numbers do not relate to anything other than a particular calibration or test and are of no use to anyone other than the buyer of that particular calibration or test. NIST test numbers should not be used, nor are they required, as proof of the adequacy or NIST traceability of measurement results. For these reasons, Calmetrics does not provide NIST test numbers for any calibration or test we perform.

Accuracy is a measure of how well a measurement value agrees with an established reference, a physical constant, or a value calculated from accepted physical or chemical principles. In the case of coating or film thickness, there are no physical constants or theoretical calculations that can be used to evaluate accuracy. As such, accuracy is evaluated based on comparisons with other known references. National Metrology Institutes (NMIs) such as the US NIST make available to industry a limited selection of certified coating thickness reference standards. In the absence of any theoretical evaluations, comparison of Calmetrics standards to congruent standards provided by US NIST and other NMIs is one way to evaluate the absolute accuracy of Calmetrics standards. However, no certified reference standards are provided by NIST or other NMIs for most of the coating thickness and material composition standards developed and produced by Calmetrics. In those cases, accuracy is evaluated by comparing thickness values measured by Calmetrics with values obtained by other measurement methods that can be used to evaluate coating thickness. As an accredited laboratory, we are required by the ISO 17025 standard to perform such comparisons annually to confirm the accuracy of our measurement results. In these cases, accuracy is judged by how closely our measurement values agree with those obtained by other measurement methods and/or other laboratories.

Calmetrics endeavors to produce and ship standards which have a clean, appealing surface appearance. However, in some cases, the processes needed to produce standards, particularly some foil materials, do not result in attractive surfaces. While we are always trying new production methods to overcome these limitations, what might be viewed as flaws in the surface of a standard, such as roughness, minor scratches, wrinkles, waviness or general non-flatness of foils, discoloration caused by the chemical processes used to produce foils, or surface oxidation, will not affect XRF calibrations and measurements in any significant way. Fortunately, XRF as a test method is relatively unaffected by surface imperfections in the samples being measured. Only severe surface damage may affect XRF coating thickness measurements.

A due date on your calibration certificate can ONLY be determined by you, the customer. According to ISO 17025, if we, the calibration lab, do not receive an answer as to what the due date should be, we MUST NOT choose a date ourselves and will leave it blank. Only the customer can decide when the standard is due for recalibration.

The calibration interval is the time period between the calibration date and the due date for recalibration. Only you, the customer, know how the standards are being handled, and therefore only you know how often the standards should be recalibrated.

Please refer to the “Services” menu on this site, where the description of our recertification services addresses this question. We suggest you start with the recommended intervals. Based on the results from our servicing of your standards, you may then choose to either increase or decrease the recalibration interval. Most of our standards are stable over the long term and, with proper handling and storage, will not require recertification as often as recommended in that description.

The standards listed in a quotation or in our published price lists have a description which indicates the ideal, or nominal, thickness of the standard. For example, a standard with part number SAU40 is a gold foil with a nominal thickness of 40 microinches, or 1 micron, of Au. When producing any standard, there are deposition variations which are normal and not fully controlled. In other words, while we attempt to deposit 1 micron of gold to fill an order for SAU40, process variables mean we produce material with a range of thicknesses, and the resulting foil is rarely exactly 1 micron thick. As such, the product part number SAU40 actually represents an Au foil standard which is ideally 40 microinches or 1 micron thick, but may in fact be a little thicker or thinner. The deposition process is not an exact science, so variation from the ideal or nominal thickness is expected. When you order a standard from us, we attempt to provide a thickness which matches as closely as possible the thickness indicated by the part number.

However, due to the process variations described above, we allow as much as +/- 15% deviation of the thickness of the standard we provide from the thickness indicated by the part number. Using the SAU40 example, we may send you an Au foil which is 0.85 micron, 1.15 microns, or any thickness between these allowable limits. We do this to control costs and therefore the prices we charge you. If you need a thickness that is within a tighter tolerance than 15% of the nominal value, you may request it, and a price will be determined and quoted to you based on the allowable thickness tolerance you provide us. In all cases, the allowable deviation from the nominal product thickness does not affect the certified accuracy (normally +/- 5%) of the thickness or composition of the standard.
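As an illustrative example using the SAU40 nominal value above: a nominal thickness of 1 micron with the +/- 15% supply tolerance means the foil you receive may measure anywhere from about 0.85 micron to 1.15 microns. If the foil we ship happens to measure, say, 1.10 microns, the certificate reports that measured value, and the certified accuracy of normally +/- 5% applies to that measured value (approximately 1.045 to 1.155 microns), not to the 1 micron nominal thickness implied by the part number.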