School of Information and Electronic Engineering, Zhejiang Gongshang University, Hangzhou 310018, China
QI Xiaochao (2001- ), female, is a master's student at the School of Information and Electronic Engineering, Zhejiang Gongshang University. Her main research interest is personalized federated learning.
NI Zhengwei (1989- ), male, Ph.D., is an associate researcher and master's supervisor at the School of Information and Electronic Engineering, Zhejiang Gongshang University. His main research interests include artificial intelligence and wireless networks.
Revised: 2025-06-23
Accepted: 2025-07-07
Published online: 2026-01-06
QI Xiaochao, NI Zhengwei. FedDACS: personalized knowledge transfer based on distribution-aware controlled sampling optimization[J]. Telecommunications Science, DOI: 10.11959/j.issn.1000-0801.2026014.
Personalized federated learning, a cutting-edge paradigm in distributed machine learning, addresses the challenge of data heterogeneity by training a dedicated model for each client. Existing research employs group knowledge transfer techniques, in which a shared predictor is deployed on the server to enhance the model's generalization over base classes, while local feature extractors are updated to improve its personalization. However, two significant drawbacks remain. First, the globally shared predictor struggles to adapt to highly heterogeneous client distributions, resulting in decreased accuracy. Second, the knowledge transfer process is easily disturbed by distribution differences, leading to negative transfer effects. To address these issues, a personalized group knowledge transfer framework based on distribution-aware controlled sampling optimization (FedDACS) was proposed. The traditional single shared predictor on the server was replaced with an array of personalized predictors, with each client owning a dedicated predictor module that, together with its local feature extractor, forms a personalized model. Furthermore, a distribution-aware controlled sampling (DACS) technique was introduced, which analyzes each client's data distribution in real time and dynamically adjusts the feature-space sampling strategy, thereby controlling the input of the personalized predictors and effectively mitigating the negative transfer caused by distribution differences. To validate the effectiveness of FedDACS, seven non-independent and identically distributed (non-IID) scenarios were constructed on the CIFAR-10 and CIFAR-100 datasets for systematic evaluation. Experimental results demonstrate that, compared with eight baseline methods including FedGKT and FedProto, FedDACS achieves an average client accuracy improvement of 6.85% on CIFAR-10 and 5.28% on CIFAR-100. These results indicate that the proposed FedDACS method effectively improves the performance of personalized models under various data heterogeneity scenarios.
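The distribution-aware sampling idea described in the abstract can be illustrated with a minimal sketch. The function names (`client_label_distribution`, `dacs_sample`) and the importance-weighting scheme below are illustrative assumptions, not the paper's actual implementation: the shared feature pool is resampled so that the class mix fed to a client's personalized predictor matches that client's own label distribution.

```python
import numpy as np

def client_label_distribution(labels: np.ndarray, num_classes: int) -> np.ndarray:
    """Empirical class distribution of one client's local labels."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / counts.sum()

def dacs_sample(features, labels, target_dist, num_samples, rng):
    """Draw a batch from the shared feature pool whose class mix matches
    the target client's distribution (importance weighting), so the
    client's personalized predictor mostly sees its own classes."""
    num_classes = len(target_dist)
    pool_dist = client_label_distribution(labels, num_classes)
    # weight each pooled sample by target probability / pool frequency;
    # classes absent from the pool get zero weight
    class_w = np.divide(target_dist, pool_dist,
                        out=np.zeros_like(target_dist, dtype=float),
                        where=pool_dist > 0)
    weights = class_w[labels]
    weights /= weights.sum()
    idx = rng.choice(len(labels), size=num_samples, replace=True, p=weights)
    return features[idx], labels[idx]

# Example: a balanced 4-class server-side pool resampled for a client
# whose data is 70% class 0.
rng = np.random.default_rng(0)
pool_labels = np.repeat(np.arange(4), 100)
pool_feats = rng.normal(size=(400, 8))
target = np.array([0.7, 0.1, 0.1, 0.1])
batch_x, batch_y = dacs_sample(pool_feats, pool_labels, target, 5000, rng)
```

The resampled batch approximately follows the client's skewed distribution rather than the pool's balanced one, which is the intended effect of controlling the personalized predictor's input.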
TAN A Z, YU H, CUI L Z, et al. Towards personalized federated learning[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(12): 9587-9603.
FALLAH A, MOKHTARI A, OZDAGLAR A E. Personalized federated learning with theoretical guarantees: a model-agnostic meta-learning approach[C]//Proceedings of the 2020 Neural Information Processing Systems. Montreal: Neural Information Processing Systems, 2020.
ZHAO Y, LI M, LAI L Z, et al. Federated learning with non-IID data[EB]. 2018.
YANG M, WANG X M, ZHU H B, et al. Federated learning with class imbalance reduction[C]//Proceedings of the 2021 29th European Signal Processing Conference (EUSIPCO). Piscataway: IEEE Press, 2021: 2174-2178.
LI T, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[C]//Proceedings of the 2020 Machine Learning and Systems, 2020, 2: 429-450.
LI D L, WANG J P. FedMD: heterogenous federated learning via model distillation[EB]. 2019.
LIANG P P, LIU T, LIU Z Y, et al. Think locally, act globally: federated learning with local and global representations[EB]. 2020.
SMITH V, CHIANG C K, SANJABI M, et al. Federated multi-task learning[C]//Proceedings of the 2017 Neural Information Processing Systems. Long Beach: Neural Information Processing Systems, 2017.
DENG Y Y, KAMANI M M, MAHDAVI M. Adaptive personalized federated learning[EB]. 2020.
LIN T, KONG L J, STICH S U, et al. Ensemble distillation for robust model fusion in federated learning[EB]. 2020.
CHEN H C, WANG J, et al. The best of both worlds: accurate global and personalized models through federated learning with data-free hyper-knowledge distillation[EB]. 2023.
HE C Y, ANNAVARAM M, AVESTIMEHR S. Group knowledge transfer: federated learning of large CNNs at the edge[EB]. 2020.
WU Z Y, SUN S, WANG Y W, et al. Exploring the distributed knowledge congruence in proxy-data-free federated distillation[J]. ACM Transactions on Intelligent Systems and Technology, 2024, 15(2): 1-34.
WU Z Y, SUN S, WANG Y W, et al. FedICT: federated multi-task distillation for multi-access edge computing[J]. IEEE Transactions on Parallel and Distributed Systems, 2024, 35(6): 1107-1121.
ITAHARA S, NISHIO T, KODA Y, et al. Distillation-based semi-supervised federated learning for communication-efficient collaborative training with non-IID private data[J]. IEEE Transactions on Mobile Computing, 2023, 22(1): 191-205.
CHANG H Y, SHEJWALKAR V, SHOKRI R, et al. Cronus: robust and heterogeneous collaborative learning with black-box knowledge transfer[EB]. 2019.
CHO Y J, WANG J Y, CHIRUVOLU T, et al. Personalized federated learning for heterogeneous clients with clustered knowledge transfer[EB]. 2021.
ZHANG J, GUO S, MA X, et al. Parameterized knowledge transfer for personalized federated learning[C]//Proceedings of the 2021 Neural Information Processing Systems. Montreal: Neural Information Processing Systems, 2021, 34: 10092-10104.
CHENG S J, WU J W, XIAO Y H, et al. FedGEMS: federated learning of larger server models via selective knowledge fusion[EB]. 2021.
XIE C L, HUANG D A, CHU W D, et al. PerAda: parameter-efficient federated learning personalization with generalization guarantees[C]//Proceedings of the 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). Piscataway: IEEE Press, 2024: 23838-23848.
WU Z Y, SUN S, WANG Y W, et al. FedCache: a knowledge cache-driven federated learning architecture for personalized edge intelligence[J]. IEEE Transactions on Mobile Computing, 2024, 23(10): 9368-9382.
TAN Y, LONG G D, LIU L, et al. FedProto: federated prototype learning across heterogeneous clients[C]//Proceedings of the 2022 AAAI Conference on Artificial Intelligence. Menlo Park: AAAI Press, 2022, 36(8): 8432-8440.
CHEN Y Q, LU W, QIN X, et al. MetaFed: federated learning among federations with cyclic knowledge distillation for personalized healthcare[J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(11): 16671-16682.
ZHUANG F Z, LUO P, HE Q, et al. Survey on transfer learning research[J]. Journal of Software, 2015, 26(1): 26-39.
LEE G, JEONG M, SHIN Y, et al. Preservation of the global knowledge by not-true distillation in federated learning[EB]. 2021.
HE Y T, CHEN Y Q, YANG X D, et al. Class-wise adaptive self distillation for federated learning on non-IID data (student abstract)[C]//Proceedings of the 2022 AAAI Conference on Artificial Intelligence. Menlo Park: AAAI Press, 2022, 36(11): 12967-12968.
KARIMIREDDY S P, KALE S, MOHRI M, et al. SCAFFOLD: stochastic controlled averaging for federated learning[C]//Proceedings of the 2020 International Conference on Machine Learning (ICML). PMLR, 2020: 5132-5143.
YAO D Z, PAN W N, DAI Y T, et al. Local-global knowledge distillation in heterogeneous federated learning with non-IID data[EB]. 2021.
LI Q B, DIAO Y Q, CHEN Q, et al. Federated learning on non-IID data silos: an experimental study[C]//Proceedings of the 2022 IEEE 38th International Conference on Data Engineering (ICDE). Piscataway: IEEE Press, 2022: 965-978.
NGUYEN Q, PHAM H H, WONG K S, et al. FedDCT: federated learning of large convolutional neural networks on resource-constrained devices using divide and collaborative training[J]. IEEE Transactions on Network and Service Management, 2024, 21(1): 418-436.
ZHANG J Q, HUA Y, WANG H, et al. GPFL: simultaneously learning global and personalized feature information for personalized federated learning[C]//Proceedings of the 2023 IEEE/CVF International Conference on Computer Vision (ICCV). Piscataway: IEEE Press, 2023: 5018-5028.
MILLS J, HU J, MIN G Y. Multi-task federated learning for personalised deep neural networks in edge computing[J]. IEEE Transactions on Parallel and Distributed Systems, 2022, 33(3): 630-641.