Smooth OWA in Federated Learning: a Newton-Cotes quadrature-inspired aggregation framework
Lublin University of Technology
Publication date: 29-08-2025
Adv. Sci. Technol. Res. J. 2025;
ABSTRACT
This paper presents a novel approach to federated learning based on the Smooth Ordered Weighted Averaging (OWA) operator, which enables flexible and context-sensitive weighting of local models during the aggregation process. To enhance the precision of the aggregated weight computations, we incorporate numerical quadrature-inspired techniques, allowing for a more accurate representation of individual client contributions to the global model. Specifically, the approach utilizes classical OWA and several smoothed variants derived from Newton-Cotes quadratures, including the 3/8 rule, the trapezoidal rule, and the ONC4 (4-point open Newton-Cotes) formula. The study compares federated learning models using standard weight averaging against those incorporating both classical and smoothed OWA operators. This evaluation provides insight into how the smoothing mechanisms influence aggregation quality and final model accuracy. A neural network comprising several dense layers served as the classification model in the federated learning framework. Two experimental scenarios were considered: one in which data was evenly distributed across local clients, and another with a non-uniform data distribution to reflect real-world heterogeneity. Various strategies for extracting the OWA weights were explored, including performance-based weighting determined by the accuracy of local models during preliminary training rounds. The proposed methodology has been tested on small-scale image datasets such as MNIST, where it demonstrated improved classification accuracy compared to traditional federated learning approaches using simple averaging.
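To illustrate the aggregation scheme the abstract describes, the following is a minimal sketch of OWA-based federated averaging with Newton-Cotes-derived weight vectors. The function name `owa_aggregate`, the four-client setting, and the use of local accuracy as the ordering score are illustrative assumptions; the coefficient vectors are the standard quadrature weights (composite trapezoid, Simpson's 3/8, 4-point open Newton-Cotes), normalized to sum to one as OWA weights must, and the paper's actual smoothing of these vectors may differ.

```python
import numpy as np

# Hypothetical OWA weight vectors for 4 clients, derived from standard
# Newton-Cotes quadrature coefficients and normalized to sum to 1.
NC_WEIGHTS = {
    "trapezoidal": np.array([1, 2, 2, 1]) / 6,     # composite trapezoid
    "3/8 rule":    np.array([1, 3, 3, 1]) / 8,     # Simpson's 3/8 rule
    "ONC4":        np.array([11, 1, 1, 11]) / 24,  # 4-point open Newton-Cotes
}

def owa_aggregate(client_models, scores, owa_weights):
    """OWA aggregation of per-layer parameters.

    client_models: list of models, each a list of numpy weight arrays.
    scores: one score per client (e.g. local accuracy from preliminary
            training rounds), used to order clients before weighting.
    owa_weights: OWA weight vector, summing to 1, one entry per client.
    """
    order = np.argsort(scores)[::-1]      # best-scoring client first
    aggregated = []
    for layer in zip(*client_models):     # iterate over layers
        stacked = np.stack([layer[i] for i in order])
        # Weighted sum of the ordered client layers.
        aggregated.append(np.tensordot(owa_weights, stacked, axes=1))
    return aggregated
```

For example, with four single-layer "models" holding constant values 1.0, 2.0, 3.0, 4.0 and accuracies 0.9, 0.7, 0.8, 0.6, the 3/8-rule weights yield (1·1.0 + 3·3.0 + 3·2.0 + 1·4.0)/8 = 2.5 in every entry of the global layer, since the ordering by accuracy places clients 1, 3, 2, 4 in that sequence.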