Xiaomo (Ruoshuili)
AI & ML interests: None yet
Recent Activity
- updated a collection (FL) 9 days ago
- updated a collection (FL) 9 days ago
- updated a collection (FL) 9 days ago
Organizations: None yet
Learning
- Exploring Reasoning Reward Model for Agents
  Paper • 2601.22154 • Published • 23
- Group-Evolving Agents: Open-Ended Self-Improvement via Experience Sharing
  Paper • 2602.04837 • Published • 8
- Agent Skills: A Data-Driven Analysis of Claude Skills for Extending Large Language Model Functionality
  Paper • 2602.08004 • Published • 5
- SEAD: Self-Evolving Agent for Multi-Turn Service Dialogue
  Paper • 2602.03548 • Published • 4
FL
- FedDIP: Federated Learning with Extreme Dynamic Pruning and Incremental Regularization
  Paper • 2309.06805 • Published • 1
- Does Federated Learning Really Need Backpropagation?
  Paper • 2301.12195 • Published
- FEDZIP: A Compression Framework for Communication-Efficient Federated Learning
  Paper • 2102.01593 • Published
- A Reputation Mechanism Is All You Need: Collaborative Fairness and Adversarial Robustness in Federated Learning
  Paper • 2011.10464 • Published
Models: 0 (none public yet)
Datasets: 0 (none public yet)