Abstract:
In spent fuel reprocessing, high-precision measurement of uranium (U) and plutonium (Pu) concentrations is a critical prerequisite for ensuring process stability and nuclear safeguards compliance. Hybrid L-edge/X-ray fluorescence (XRF) densitometry (HLED) integrates L-edge densitometry (LED) with XRF analysis and offers distinct advantages, including rapid detection, high measurement precision, and non-destructive testing capability. Existing research predominantly employs non-radioactive elements (e.g., lead, gold) as surrogates for U and Pu. However, these substitutes cannot fully replicate the spectral interference and matrix effects inherent to actual nuclear materials. To address two limitations of current HLED studies, namely the insufficient similarity of surrogate nuclides and the lack of standardized measurement protocols for mixed samples, a comprehensive and robust HLED measurement methodology was established in this paper. Given its chemical similarity to U and Pu and its low radioactivity, thorium (Th) was selected as the chemical analog to optimize measurement parameters, develop dedicated analytical algorithms, and validate the precision and accuracy of the HLED system. A series of aqueous Th standard samples (1–200 g/L) and U-Th mixed samples (concentration ratio of 100:1) were prepared using Th as the simulated nuclide. Based on a laboratory-scale HLED setup, the measurement parameters (specifically, the X-ray tube voltage and current) were optimized to align the reference spectral peak with the U LⅢ absorption edge. For data processing in the LED branch, regions of interest (ROI) were determined via differential analysis of the ln(ln(1/T))-lnE curve (T: transmittance; E: photon energy), and two transmittance calculation methods (the endpoint method and the extrapolation method) were compared. For the XRF branch, calibration curves were established using three approaches: total peak area, net peak area, and specific scattering ratio.
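To make the ROI-selection step described above concrete, the following Python sketch locates the absorption edge as the maximum of the numerical derivative of ln(ln(1/T)) with respect to lnE and places low- and high-energy ROIs on either side of it. The function name, the guard-band width, and the window widths are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def select_rois(energy_kev, transmittance, guard=0.25):
    """Locate the L-edge on the ln(ln(1/T))-lnE curve and propose ROIs.

    energy_kev    : 1-D array of channel energies (keV), ascending
    transmittance : 1-D array of T values in (0, 1) at those energies
    guard         : half-width (keV) excluded around the edge (assumed value)
    """
    x = np.log(energy_kev)
    y = np.log(np.log(1.0 / transmittance))  # linearized attenuation curve
    dydx = np.gradient(y, x)                 # differential analysis
    edge = energy_kev[np.argmax(dydx)]       # steepest rise marks the edge
    # The guard band excludes the edge region itself; the window widths
    # roughly echo the ~1.5 keV ROIs reported below but are not exact.
    low_roi = (energy_kev >= edge - 1.5) & (energy_kev <= edge - guard)
    high_roi = (energy_kev >= edge + guard) & (energy_kev <= edge + 2.0)
    return edge, low_roi, high_roi
```

With the optimized settings reported below, such a procedure should flag an edge near 17.17 keV and windows comparable to the 15.79–16.92 keV and 17.45–19.09 keV ROIs.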
The relationship between the calibration factor R_U/Th and the uranium concentration was established to analyze U-Th mixed samples. Experimental results show that the optimal measurement parameters are an X-ray tube voltage of 26.1 kV and a current of 78.4 μA. Under these conditions, the reference spectrum peak energy closely matches the uranium LⅢ absorption edge (17.17 keV), effectively extending the upper limit of concentration measurement. The optimal ROIs for the LED branch are 15.79–16.92 keV (low-energy side) and 17.45–19.09 keV (high-energy side). For pure Th samples, the endpoint method exhibits superior precision over the extrapolation method, yielding a relative standard deviation (RSD) of 0.22% for 50 g/L samples. In contrast, the extrapolation method delivers better accuracy for U-Th mixed samples, with the relative error of uranium concentration measurement below 0.41%.
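In L-edge densitometry, the heavy-element concentration follows from the transmittance jump across the edge via the standard relation c = ln(T-/T+)/(Δμ·d), where Δμ is the change in the mass attenuation coefficient at the edge and d the transmission path length. The sketch below contrasts one plausible reading of the two estimators compared here: the endpoint method takes T directly from the ROI channels adjacent to the edge, whereas the extrapolation method fits ln T against energy within each ROI and evaluates both fits at the edge energy. The function names and the linear-fit choice are assumptions for illustration.

```python
import numpy as np

EDGE_KEV = 17.17  # uranium L_III absorption edge energy

def edge_jump_endpoint(t_low, t_high):
    """Endpoint method: take T from the ROI channels adjacent to the
    edge (arrays ordered by ascending energy within each ROI)."""
    return np.log(t_low[-1] / t_high[0])          # ln(T-/T+)

def edge_jump_extrapolated(e_low, t_low, e_high, t_high):
    """Extrapolation method: fit ln T vs E inside each ROI, then
    evaluate both fits at the edge energy before differencing."""
    fit_lo = np.polyfit(e_low, np.log(t_low), 1)  # assumed linear fit
    fit_hi = np.polyfit(e_high, np.log(t_high), 1)
    return np.polyval(fit_lo, EDGE_KEV) - np.polyval(fit_hi, EDGE_KEV)

def concentration_g_per_l(jump, delta_mu_cm2_g, thickness_cm):
    """LED relation c = ln(T-/T+)/(delta_mu * d), converted to g/L."""
    return jump / (delta_mu_cm2_g * thickness_cm) * 1000.0
```

Averaging over endpoint channels suppresses statistical noise for single-element samples, while extrapolation corrects the smooth matrix attenuation underlying the edge, which is consistent with the precision/accuracy trade-off reported above.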
Moreover, the XRF branch achieves excellent linearity via the specific scattering ratio method (R² = 0.9997). By combining the uranium concentration from the LED branch with the concentration ratio from the XRF branch, the relative error of thorium concentration measurement remains under 0.61%. This study establishes a systematic HLED measurement method for U-Th mixed systems. The results confirm that the optimized parameter settings and the integrated LED/XRF algorithm significantly improve measurement accuracy and stability. Comparative analysis indicates that the endpoint method ensures higher precision for single-element analysis, whereas the extrapolation method is more robust for mixed matrices. This work resolves key technical issues in surrogate selection and mixed-sample analysis, laying a solid experimental foundation for the engineering application of HLED technology in on-site monitoring of uranium and plutonium concentrations at spent fuel reprocessing plants.
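A minimal sketch of the final combination step is given below, under the assumption that R_U/Th varies linearly with uranium concentration and converts the measured U/Th XRF peak-intensity ratio into a concentration ratio; all names and the calibration form are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def fit_ratio_calibration(c_u_standards, r_factors):
    """Assumed linear model for the calibration factor R_U/Th as a
    function of uranium concentration, built from standard samples."""
    slope, intercept = np.polyfit(c_u_standards, r_factors, 1)
    return lambda c_u: slope * c_u + intercept

def thorium_concentration(c_u_led, xrf_intensity_ratio, r_of_cu):
    """Combine the branches: the LED branch gives c_U, the XRF branch
    gives the U/Th peak-intensity ratio, and R_U/Th (evaluated at c_U)
    converts that intensity ratio into a concentration ratio."""
    conc_ratio = r_of_cu(c_u_led) * xrf_intensity_ratio  # c_U / c_Th
    return c_u_led / conc_ratio
```

For instance, a LED result of c_U = 100 g/L combined with a recovered concentration ratio of 100 returns c_Th = 1 g/L, matching the nominal 100:1 composition of the prepared mixed samples.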