Abstract: Existing methods for calculating the theoretical line loss rate face two limitations: physics-based models are inefficient and difficult to apply online, while data-driven approaches lack the interpretability required for engineering deployment. To address these issues, this paper proposes an interpretable data-driven method for calculating the theoretical line loss rate of low-voltage distribution areas. First, the literature on the physical mechanisms affecting the theoretical line loss rate is systematically reviewed, and the key influencing features are identified. Second, a calculation model is built around the light gradient boosting machine (LightGBM), with an improved grey wolf optimizer (IGWO) employed for hyperparameter optimization to enhance model accuracy. Furthermore, the Shapley additive explanations (SHAP) method is introduced to quantify each feature's contribution to the calculated result, thereby revealing whether the model's decision logic is consistent with the underlying physical mechanisms of line losses. Finally, case studies on real-world distribution area datasets demonstrate the effectiveness of the proposed method.
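The hyperparameter-tuning step can be illustrated with a minimal sketch of the standard grey wolf optimizer; the specific improvements of the paper's IGWO are not described in the abstract, so this shows only the baseline GWO update (wolves move toward the three current best solutions, alpha, beta, and delta, under a coefficient that decays linearly from 2 to 0). The function name `gwo_minimize` and the toy objective are illustrative, not from the paper.

```python
import random

def gwo_minimize(objective, bounds, n_wolves=12, n_iters=50, seed=0):
    """Minimize `objective` over the box `bounds` with the standard
    grey wolf optimizer (baseline GWO, not the paper's improved IGWO)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize the pack uniformly at random inside the search box.
    wolves = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(n_wolves)]

    def clip(x, lo, hi):
        return max(lo, min(hi, x))

    for it in range(n_iters):
        # The three best wolves lead the hunt.
        wolves.sort(key=objective)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1.0 - it / n_iters)  # decays linearly from 2 to 0
        for i in range(n_wolves):
            new_pos = []
            for d in range(dim):
                candidates = []
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = 2.0 * a * r1 - a   # step scale, shrinks over time
                    C = 2.0 * r2           # random emphasis on the leader
                    D = abs(C * leader[d] - wolves[i][d])
                    candidates.append(leader[d] - A * D)
                lo, hi = bounds[d]
                # Average of the moves toward the three leaders.
                new_pos.append(clip(sum(candidates) / 3.0, lo, hi))
            wolves[i] = new_pos

    return min(wolves, key=objective)

# Toy use: minimize a 2-D sphere function (optimum at the origin).
# In the paper's setting, `objective` would instead be a cross-validation
# loss of LightGBM evaluated at a candidate hyperparameter vector.
best = gwo_minimize(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```

In the actual method, each wolf's position vector would encode LightGBM hyperparameters (e.g. learning rate, number of leaves), and the objective would be a validation error rather than the toy sphere function used here.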