On the Niton XRF analyzer, calibration factors are used to adjust for consistently high or consistently low readings from the instrument.
They are calculated by plotting the percentage of each element as indicated by standard samples against the percentage of each element as reported by the XRF analyzer.
The linear regression function is then used to calculate the slope and intercept of the straight line that best fits these data points. That slope and intercept are the calibration factors.
Calculating calibration factors requires several steps:
1) Measuring the standard samples
2) Calculating the calibration factors
3) Adding the calibration factors to the analyzer
Measuring the Standard Samples
Use the XRF analyzer to take readings of samples whose composition you already know. It is important that the known composition of these samples be accurate, because they provide the baseline to which the XRF analyzer is adjusted. If the known composition is inaccurate, the XRF analyzer will also be inaccurate.
For each sample, take a reading of 120 seconds and make a note of the reading number. (You can also store the readings on the analyzer for download to NDT and Excel.)
For each sample, the XRF analyzer reports the percentage by weight for each element present. With the calibration factors set to their defaults, these percentages will differ from the known percentages; they are the values used to calculate the calibration factors.
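As an illustration, the recorded readings for one element could be organized as below before the regression step. This is only a sketch; all sample IDs and percentages are hypothetical placeholders, not values from this document.

    # Hypothetical record of standard-sample readings for one element.
    # "known" comes from the certified standards; "reported" comes from
    # the XRF analyzer with the calibration factors at their defaults.
    readings = [
        # (standard ID, known %, reported %)
        ("STD-1", 2.0, 1.2),
        ("STD-2", 5.0, 3.4),
        ("STD-3", 10.0, 7.3),
        ("STD-4", 20.0, 14.8),
    ]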
Calculating Calibration Factors
Using the data that you collected by measuring the standard samples, plot the percentage of each element as indicated by the standard samples against the percentage of each element as reported by the XRF analyzer. Then use the linear regression function to calculate the slope and intercept of the straight line that best fits those data points.
The slope and intercept of this line are the calibration factors for the element.
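As a sketch of the same calculation done in a script rather than a spreadsheet, Python's standard library can compute the slope, intercept, and R-squared value directly. The data values below are the same hypothetical placeholders used above, not real measurements.

    # Fit known = slope * reported + intercept, mirroring the Excel
    # trendline. Requires Python 3.10+ for statistics.linear_regression.
    import statistics

    reported = [1.2, 3.4, 7.3, 14.8]   # % as reported by the XRF analyzer (x)
    known = [2.0, 5.0, 10.0, 20.0]     # % as indicated by the standards (y)

    fit = statistics.linear_regression(reported, known)
    r_squared = statistics.correlation(reported, known) ** 2

    print(f"slope = {fit.slope:.4f}")
    print(f"intercept = {fit.intercept:.4f}")
    print(f"R-squared = {r_squared:.4f}")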
You may use any tool you prefer to make this calculation; the sketch above shows one scripted option. The steps below demonstrate the Excel method.
1. In the first column, enter the percentages for the element as reported by the XRF analyzer.
2. In the second column, enter the percentages for the element as indicated by each standard.
3. Use the cursor to highlight all of the numbers in both columns, then click the Chart Wizard button.
4. Select the XY (scatter) chart with data points that are not connected by lines (first chart, top diagram).
5. Click Finish.
6. Right-click on one of the data points.
7. Click “Add Trendline” on the pop-up menu.
8. On the Type tab, click Linear.
9. On the Options tab, check the boxes for “Display equation on chart” and “Display R-squared value on chart.” Click OK.
10. The equation shows the slope and intercept for the trend line. These are the calibration factors that you enter into the XRF Analyzer.
Example: y = 1.3244x + 0.4355, R² = 0.9732
In this example, the slope is 1.3244 and the intercept is 0.4355.
Note: If the intercept is negative in the equation, be sure that you enter the intercept as negative in the XRF Analyzer.
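As a quick sanity check on the example equation, and assuming the analyzer applies the factors as a straight linear correction (an assumption; this document does not spell out the analyzer's internal formula), a raw reading can be corrected by hand:

    # Hypothetical check: apply the example calibration factors to a raw
    # reading, assuming corrected = slope * reported + intercept.
    slope, intercept = 1.3244, 0.4355
    reported = 10.0    # hypothetical % reported with default factors
    corrected = slope * reported + intercept
    print(corrected)   # 13.6795 -> adjusted toward the known value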
Adding the Calibration Factors to the Analyzer
1. From the Main menu of the Niton XRF Analyzer, select Common Set-Up and then Adjust Calibration.
2. Select the mode for which you wish to change the calibration factors.
3. You will see five cal factor settings. The Factory setting is the default calibration; Sets 1 through 4 are slots where you may enter calculated calibration factors.
4. Select a setting for which no cal factors have been added and click Edit.
5. Touch the appropriate element's slope and intercept boxes on the screen to bring up the editing screen, then change the default values to the calculated slope and intercept.
6. Rename the cal factor setting by pressing the keyboard icon, then click Save.