Study on Experimental Methods of Hybrid L-edge/L-XRF Densitometry Technology

  • Abstract: In spent fuel reprocessing, high-precision measurement of uranium and plutonium concentrations is essential for stable process operation. Hybrid L-edge/L-XRF densitometry (HLED) combines L-edge densitometry (LED) with X-ray fluorescence (XRF) analysis; it offers fast analysis, high precision, and non-destructive operation, making it well suited to uranium and plutonium concentration measurement. To address two gaps in existing HLED research, namely surrogate nuclides that poorly match the target elements and the lack of a measurement method for mixed samples, this work used thorium, which is chemically similar to uranium and only weakly radioactive, as the simulated nuclide. On a laboratory HLED setup, a series of aqueous thorium standard samples and uranium-thorium mixed samples were prepared to optimize the measurement parameters, establish working methods, and verify precision and accuracy. The results show that the optimal measurement parameters are an X-ray tube voltage of 26.1 kV and a current of 78.4 μA, at which the reference-spectrum peak energy lies close to the uranium L absorption edge (17.17 keV), raising the upper limit of measurable concentration. Differential analysis of the ln(ln(1/T))-ln E curve gave the optimal regions of interest (ROI; ROIL on the low-energy side, ROIH on the high-energy side) for the LED branch as ROIL = 15.79-16.92 keV and ROIH = 17.45-19.09 keV. For pure thorium samples, the endpoint method was more precise than the extrapolation method, with a relative standard deviation (RSD) of 0.22% for the 50 g/L sample. The working curve of the XRF branch, built with the specific scattering ratio method, had a linear goodness of fit R² = 0.9997. In the analysis of uranium-thorium mixed samples, the extrapolation method was more accurate: the relative deviation of the measured uranium concentration was below 0.41%, and the thorium concentration calculated from the XRF concentration ratio deviated by less than 0.61%. The HLED measurement method established here lays an experimental foundation for the engineering application of uranium and plutonium concentration analysis in spent fuel reprocessing.
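The ROI selection step described above, placing windows on either side of the L edge found from the derivative of ln(ln(1/T)) versus ln E, can be sketched roughly as follows. This is a minimal illustration, not the paper's procedure: the channel spacing, guard band, window width, and synthetic step spectrum are all invented for demonstration.

```python
import math

def roi_from_edge(energies_keV, transmissions, guard=2, width=4):
    """Pick LED regions of interest from the derivative of ln(ln(1/T)).

    The absorption edge appears as the steepest point of ln(ln(1/T)) versus
    ln(E); ROIs are then placed 'guard' channels away on each side, 'width'
    channels wide. guard/width are illustrative knobs, not the paper's values.
    """
    x = [math.log(e) for e in energies_keV]
    y = [math.log(math.log(1.0 / t)) for t in transmissions]  # needs 0 < T < 1
    # Central-difference derivative dy/dx on the interior channels.
    d = [(y[i + 1] - y[i - 1]) / (x[i + 1] - x[i - 1])
         for i in range(1, len(x) - 1)]
    edge = 1 + max(range(len(d)), key=lambda i: abs(d[i]))  # steepest channel
    n = len(energies_keV) - 1
    roi_low = (energies_keV[max(edge - guard - width, 0)],
               energies_keV[max(edge - guard, 0)])
    roi_high = (energies_keV[min(edge + guard, n)],
                energies_keV[min(edge + guard + width, n)])
    return roi_low, roi_high

# Synthetic step spectrum around the uranium L3 edge at 17.17 keV:
E = [15.0 + 0.1 * i for i in range(51)]
T = [0.55 if e < 17.17 else 0.40 for e in E]
roi_l, roi_h = roi_from_edge(E, T)  # windows just below and above the edge
```

On real spectra the derivative is noisy, so the measured curve would be smoothed before locating the extremum; the guard band keeps both ROIs clear of the edge region itself.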


    Abstract: In spent fuel reprocessing, high-precision measurement of uranium (U) and plutonium (Pu) concentrations is a critical prerequisite for process stability and nuclear safeguards compliance. Hybrid L-edge/X-ray fluorescence (XRF) densitometry (HLED) integrates L-edge densitometry (LED) with XRF analysis, offering rapid detection, high measurement precision, and non-destructive testing. Existing research predominantly employs non-radioactive elements (e.g., lead, gold) as surrogates for U and Pu, but these substitutes cannot fully replicate the spectral interference and matrix effects of actual nuclear materials. To address the insufficient similarity of surrogate nuclides and the lack of standardized measurement protocols for mixed samples in current HLED studies, a comprehensive HLED measurement methodology was established in this work. Given its chemical similarity to U and Pu and its low radioactivity, thorium (Th) was selected as the chemical analog for optimizing measurement parameters, developing dedicated analytical algorithms, and validating the precision and accuracy of the HLED system. A series of aqueous Th standard samples (1-200 g/L) and U-Th mixed samples (concentration ratio 100:1) were prepared. On a laboratory-scale HLED setup, the measurement parameters (X-ray tube voltage and current) were optimized to align the reference spectral peak with the U L absorption edge. For data processing in the LED branch, regions of interest (ROI) were determined by differential analysis of the ln(ln(1/T))-ln E curve, and two transmittance calculation methods, the endpoint method and the extrapolation method, were compared. For the XRF branch, calibration curves were established using three approaches: total peak area, net peak area, and specific scattering ratio. The relationship between the calibration factor R(U/Th) and uranium concentration was then used to analyze U-Th mixed samples. Experimental results show that the optimal measurement parameters are an X-ray tube voltage of 26.1 kV and a current of 78.4 μA; under these conditions the reference-spectrum peak energy closely matches the uranium L absorption edge (17.17 keV), effectively extending the upper limit of concentration measurement. The optimal ROIs for the LED branch are 15.79-16.92 keV (low-energy side) and 17.45-19.09 keV (high-energy side). For pure Th samples, the endpoint method is more precise than the extrapolation method, yielding a relative standard deviation (RSD) of 0.22% for 50 g/L samples. In contrast, the extrapolation method is more accurate for U-Th mixed samples, keeping the relative error of the uranium concentration below 0.41%. The XRF branch achieves excellent linearity with the specific scattering ratio method (R² = 0.9997). Combining the uranium concentration from the LED branch with the concentration ratio from the XRF branch keeps the relative error of the thorium concentration under 0.61%. This study establishes a systematic HLED measurement method for U-Th mixed systems: optimized parameter settings and the integrated LED/XRF algorithm significantly improve measurement accuracy and stability, the endpoint method gives higher precision for single-element analysis, and the extrapolation method is more robust for mixed matrices. This work resolves key technical issues in surrogate selection and mixed-sample analysis, laying a solid experimental foundation for the engineering application of HLED technology in on-site monitoring of uranium and plutonium concentrations at spent fuel reprocessing plants.
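The relation underlying the LED branch, Beer-Lambert attenuation with a jump Δμ in the analyte's mass attenuation coefficient at its L edge, can be illustrated with a minimal sketch. The Δμ value, path length, and transmissions below are invented for illustration only; in practice they come from tabulated attenuation data and the calibrated measurement cell.

```python
import math

def ledge_concentration(t_below, t_above, delta_mu, path_cm):
    """Analyte concentration (g/L) from the transmission jump at an L edge.

    t_below / t_above: mean transmission in the ROI just below / above the
    edge (e.g. ROIL = 15.79-16.92 keV, ROIH = 17.45-19.09 keV for uranium).
    delta_mu: jump of the analyte's mass attenuation coefficient across the
              edge, cm^2/g (illustrative here; tabulated in practice).
    path_cm: transmission path length through the sample cell, cm (assumed).
    """
    # Beer-Lambert: ln T = -(mu_matrix + mu_analyte * c) * d at each energy.
    # Taking the ratio across the edge cancels the smooth matrix term and
    # leaves only the analyte's edge jump:
    #   c = ln(t_below / t_above) / (delta_mu * d)   [g/cm^3]
    return 1000.0 * math.log(t_below / t_above) / (delta_mu * path_cm)

# Illustrative numbers only: a ~50 g/L sample with an assumed delta_mu and cell.
c_g_per_L = ledge_concentration(t_below=0.55, t_above=0.40,
                                delta_mu=3.2, path_cm=2.0)
```

Because the matrix attenuation cancels in the ratio, the result depends only on the edge jump of the heavy element, which is what makes the technique matrix-insensitive and non-destructive.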
