PulseAugur
research · [19 sources]

New federated learning methods tackle data heterogeneity and scalability challenges

Researchers have developed several new methods to improve federated learning, a distributed machine-learning approach that trains models on decentralized data without sharing raw information. FedHarmony addresses the difficulty of modeling label correlations across heterogeneous client data by introducing a consensus mechanism. "Who Trains Matters" tackles enrollment and participation selection biases by proposing an inverse-probability-weighted aggregation scheme that keeps training representative of the target population. Additionally, techniques such as subspace optimization, low-rank gradient projection (FedSLoP), and gradient sharding for serverless aggregation aim to improve efficiency by reducing communication and memory overhead, particularly for large models.
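The inverse-probability-weighted aggregation idea mentioned above can be illustrated with a minimal sketch. All names here are hypothetical, and the actual papers' algorithms differ in detail; this only shows the core reweighting step under the assumption that each client's participation probability is known:

```python
def ipw_aggregate(client_updates, participation_probs):
    """Average client model updates, weighting each by the inverse of
    that client's participation probability so that rarely-participating
    clients are not underrepresented in the global update."""
    raw = [1.0 / p for p in participation_probs]
    total = sum(raw)
    weights = [w / total for w in raw]  # normalize weights to sum to 1
    dim = len(client_updates[0])
    return [sum(w * u[i] for w, u in zip(weights, client_updates))
            for i in range(dim)]

# Two clients: the second participates half as often, so its update
# carries twice the weight of the first after normalization.
agg = ipw_aggregate([[1.0, 1.0], [4.0, 4.0]], [1.0, 0.5])
```

In practice the probabilities themselves must be estimated, which is a large part of what makes the selection-bias problem hard.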

Summary written by gemini-2.5-flash-lite from 19 sources.

IMPACT New federated learning algorithms promise improved efficiency and accuracy, especially for large models and heterogeneous data.

RANK_REASON Multiple new research papers detailing novel algorithms and frameworks for federated learning.



COVERAGE [19]

  1. arXiv cs.LG TIER_1 · Zhiqiang Kou, Junxiang Wu, Wenke Huang, Wenwen He, Ming-Kun Xie, Changwei Wang, Yuheng Jia, Di Jiang, Yang Liu, Xin Geng, Qiang Yang ·

    FedHarmony: Harmonizing Heterogeneous Label Correlations in Federated Multi-Label Learning

    arXiv:2604.28024v1 Announce Type: new Abstract: Federated Multi-Label Learning is a distributed paradigm where multiple clients possess heterogeneous multi-label data and perform collaborative learning under privacy constraints without sharing raw data. However, modeling label co…

  2. arXiv cs.LG TIER_1 · Qiang Yang ·

    FedHarmony: Harmonizing Heterogeneous Label Correlations in Federated Multi-Label Learning

    Federated Multi-Label Learning is a distributed paradigm where multiple clients possess heterogeneous multi-label data and perform collaborative learning under privacy constraints without sharing raw data. However, modeling label correlations under heterogeneous distributions rem…

  3. arXiv cs.LG TIER_1 · Gota Morishita ·

    Who Trains Matters: Federated Learning under Enrollment and Participation Selection Biases

    arXiv:2604.26604v1 Announce Type: new Abstract: Federated learning (FL) trains a shared model from updates contributed by distributed clients, often implicitly assuming that contributing clients are representative of the target population. In practice, this representativeness ass…

  4. arXiv cs.LG TIER_1 · Gota Morishita ·

    Who Trains Matters: Federated Learning under Enrollment and Participation Selection Biases

    Federated learning (FL) trains a shared model from updates contributed by distributed clients, often implicitly assuming that contributing clients are representative of the target population. In practice, this representativeness assumption can fail at two distinct stages, inducin…

  5. arXiv cs.LG TIER_1 · Shuchen Zhu, Zhengyang Huang, Yuqi Xu, Peijin Li ·

    Subspace Optimization for Efficient Federated Learning under Heterogeneous Data

    arXiv:2604.25467v1 Announce Type: new Abstract: Federated learning increasingly operates in a large-model regime where communication, memory, and computation are all scarce. Typically, non-IID client data induce drift that degrades the stability and performance of local training.…

  6. arXiv cs.LG TIER_1 · Peijin Li ·

    Subspace Optimization for Efficient Federated Learning under Heterogeneous Data

    Federated learning increasingly operates in a large-model regime where communication, memory, and computation are all scarce. Typically, non-IID client data induce drift that degrades the stability and performance of local training. Existing remedies such as SCAFFOLD introduce he…

  7. arXiv cs.LG TIER_1 · Yutong He, Zhengyang Huang, Jiahe Geng ·

    FedSLoP: Memory-Efficient Federated Learning with Low-Rank Gradient Projection

    arXiv:2604.24012v1 Announce Type: new Abstract: Federated learning enables a population of clients to collaboratively train machine learning models without exchanging their raw data, but standard algorithms such as FedAvg suffer from slow convergence and high communication and me…

  8. arXiv cs.LG TIER_1 · Taehwan Yoon, Bongjun Choi, Wesley De Neve ·

    FedRef: Bayesian Fine-Tuning using a Reference Model to Mitigate Catastrophic Forgetting for Heterogeneous Federated Learning

    arXiv:2506.23210v5 Announce Type: replace Abstract: Federated learning (FL) enables collaborative model training across distributed clients while preserving data privacy. However, data and system heterogeneity often cause catastrophic forgetting and unbounded drift in model updat…

  9. arXiv cs.AI TIER_1 · Amine Barrak ·

    Shard the Gradient, Scale the Model: Serverless Federated Aggregation via Gradient Partitioning

    arXiv:2604.22072v1 Announce Type: cross Abstract: Federated learning (FL) aggregation on serverless platforms faces a hard scalability ceiling: existing architectures (lambda-FL, LIFL) partition clients across aggregators, but every aggregator must hold the complete model gradien…

  10. arXiv cs.AI TIER_1 · Amine Barrak ·

    Shard the Gradient, Scale the Model: Serverless Federated Aggregation via Gradient Partitioning

    Federated learning (FL) aggregation on serverless platforms faces a hard scalability ceiling: existing architectures (lambda-FL, LIFL) partition clients across aggregators, but every aggregator must hold the complete model gradient in memory. When gradients exceed the per-functio…

  11. Hugging Face Daily Papers TIER_1 ·

    Decision-Focused Federated Learning Under Heterogeneous Objectives and Constraints

    We consider what we refer to as {Decision-Focused Federated Learning (DFFL)} framework, i.e., a predict-then-optimize approach employed by a collection of agents, where each agent's predictive model is an input to a downstream linear optimization problem, and no direct exchange o…

  12. arXiv cs.CV TIER_1 · Mahad Ali, Laura J. Brattain ·

    FMCL: Class-Aware Client Clustering with Foundation Model Representations for Heterogeneous Federated Learning

    arXiv:2604.27510v1 Announce Type: cross Abstract: Federated Learning (FL) enables collaborative model training across distributed clients without sharing raw data, yet its performance deteriorates under statistical heterogeneity. Clustered Federated Learning addresses this challe…

  13. arXiv cs.CV TIER_1 · Laura J. Brattain ·

    FMCL: Class-Aware Client Clustering with Foundation Model Representations for Heterogeneous Federated Learning

    Federated Learning (FL) enables collaborative model training across distributed clients without sharing raw data, yet its performance deteriorates under statistical heterogeneity. Clustered Federated Learning addresses this challenge by grouping similar clients and training separ…

  14. arXiv cs.CV TIER_1 · Emre Ardıç, Yakup Genç ·

    Sample Selection Using Multi-Task Autoencoders in Federated Learning with Non-IID Data

    arXiv:2604.26116v1 Announce Type: new Abstract: Federated learning is a machine learning paradigm in which multiple devices collaboratively train a model under the supervision of a central server while ensuring data privacy. However, its performance is often hindered by redundant…

  15. arXiv stat.ML TIER_1 · Alexander Vinel ·

    Decision-Focused Federated Learning Under Heterogeneous Objectives and Constraints

    We consider what we refer to as {Decision-Focused Federated Learning (DFFL)} framework, i.e., a predict-then-optimize approach employed by a collection of agents, where each agent's predictive model is an input to a downstream linear optimization problem, and no direct exchange o…

  16. arXiv stat.ML TIER_1 · Xiaolei Fang ·

    Heterogeneity-Aware Personalized Federated Learning for Industrial Predictive Analytics

    Federated prognostics enable clients (e.g., companies, factories, and production lines) to collaboratively develop a failure time prediction model while keeping each client's data local and confidential. However, traditional federated models often assume homogeneity in the degrad…

  17. Practical AI TIER_1 · Practical AI LLC ·

    Friendly federated learning 🌼

    <p>This episode is a follow up to our recent Fully Connected <a href="https://practicalai.fm/153">show discussing federated learning</a>. In that previous discussion, we mentioned <a href="https://flower.dev/">Flower</a> (a “friendly” federated learning framework). Well, one of t…

  18. Practical AI TIER_1 · Practical AI LLC ·

    Federated Learning 📱

    <p>Federated learning is increasingly practical for machine learning developers because of the challenges we face with model and data privacy. In this fully connected episode, Chris and Daniel dive into the topic and dissect the ideas behind federated learning, practicalities of …

  19. Mastodon — fosstodon.org TIER_1 Russian(RU) · [email protected] ·

    Federated Learning with Memory Constraints on Edge Devices, Part 2: How to Train ML Models on Edge Devices with <256 MB of Memory?

    Federated learning under memory constraints on edge devices, part 2. How do you train ML models on edge devices with less than 256 MB of memory? Hi, Habr! I'm Alexander Loshkarev, a software engineer, and this is the second part of a series on federated learning. In https:// habr.com/ru/compa…
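The gradient-partitioning idea behind the serverless aggregation entries above (items 9 and 10) can be sketched as a toy example. All function names are hypothetical and real systems shard across network boundaries with fault handling; this only illustrates why no single aggregator needs to hold a full gradient:

```python
def shard(vec, num_shards):
    """Split a flat gradient into contiguous, near-equal shards so each
    aggregator function only ever holds one shard in memory."""
    size = (len(vec) + num_shards - 1) // num_shards  # ceil division
    return [vec[i * size:(i + 1) * size] for i in range(num_shards)]

def aggregate_shard(shard_id, client_grads, num_shards):
    """Each aggregator averages only its own shard across all clients,
    rather than holding any client's complete gradient."""
    pieces = [shard(g, num_shards)[shard_id] for g in client_grads]
    return [sum(p[i] for p in pieces) / len(pieces)
            for i in range(len(pieces[0]))]

def reassemble(shard_results):
    """Concatenate the per-shard averages back into the full gradient."""
    return [x for part in shard_results for x in part]

# Two clients, a gradient of length 4, split across 2 aggregators.
grads = [[1.0, 2.0, 3.0, 4.0], [3.0, 4.0, 5.0, 6.0]]
full = reassemble([aggregate_shard(i, grads, 2) for i in range(2)])
```

Peak per-aggregator memory now scales with the shard size rather than the full model size, which is the scalability ceiling the abstract describes.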