The paper introduces FCUCR, a federated continual learning framework for user-centric recommendation that addresses temporal forgetting and weakened collaborative personalization. To mitigate forgetting, the authors employ a time-aware self-distillation strategy during local model updates. To improve collaborative personalization under heterogeneous user data, they design an inter-user prototype transfer mechanism that enriches client representations with knowledge from similar users.
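The time-aware self-distillation idea can be illustrated with a minimal sketch: the client's previous local model acts as a frozen teacher, and its regularizing influence decays with the time elapsed since that snapshot, so stale preferences constrain learning less. The function names, the exponential decay schedule, and all parameters below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, tau=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = logits / tau
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def time_aware_distillation_loss(student_logits, teacher_logits,
                                 elapsed_rounds, tau=2.0, decay=0.1):
    """Hypothetical time-aware self-distillation loss.

    KL(teacher || student) between softened predictions, down-weighted
    by exp(-decay * elapsed_rounds) so older snapshots matter less.
    """
    p = softmax(teacher_logits, tau)  # frozen previous local model
    q = softmax(student_logits, tau)  # current local model being updated
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1).mean()
    weight = np.exp(-decay * elapsed_rounds)
    return weight * (tau ** 2) * kl
```

In this sketch the distillation term would be added to the usual recommendation loss during each local update, with `decay` controlling how quickly old preference knowledge is allowed to fade.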
Federated recommendation systems can now better adapt to evolving user preferences without sacrificing privacy, thanks to a novel approach that retains historical knowledge and transfers insights between similar users.
User-centric recommendation has become essential for delivering personalized services, as it enables systems to adapt to users' evolving behaviors while respecting their long-term preferences and privacy constraints. Although federated learning offers a promising alternative to centralized training, existing approaches largely overlook user behavior dynamics, leading to temporal forgetting and weakened collaborative personalization. In this work, we propose FCUCR, a federated continual recommendation framework designed to support long-term personalization in a privacy-preserving manner. To address temporal forgetting, we introduce a time-aware self-distillation strategy that implicitly retains historical preferences during local model updates. To tackle collaborative personalization under heterogeneous user data, we design an inter-user prototype transfer mechanism that enriches each client's representation using knowledge from similar users while preserving individual decision logic. Extensive experiments on four public benchmarks demonstrate the superior effectiveness of our approach, along with strong compatibility and practical applicability. Code is available.
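The inter-user prototype transfer mechanism can likewise be sketched in a simplified form: each client contributes a prototype (e.g., a mean user-preference embedding), and a client enriches its own representation with a similarity-weighted blend of its most similar peers' prototypes. Everything here, including the cosine-similarity peer selection, the top-k choice, and the mixing weight `alpha`, is an assumed simplification for illustration, not the paper's actual mechanism.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two prototype vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def transfer_prototypes(own_proto, peer_protos, top_k=2, alpha=0.3):
    """Hypothetical inter-user prototype transfer.

    Blends the client's own prototype with a softmax-weighted average of
    its top_k most similar peers; alpha controls how much peer knowledge
    is mixed in, keeping the client's own representation dominant.
    """
    sims = np.array([cosine_sim(own_proto, p) for p in peer_protos])
    idx = np.argsort(sims)[-top_k:]          # indices of most similar peers
    w = np.exp(sims[idx])
    w /= w.sum()                             # normalize peer weights
    peer_mix = np.sum([wi * peer_protos[i] for wi, i in zip(w, idx)], axis=0)
    return (1 - alpha) * own_proto + alpha * peer_mix
```

Because only compact prototypes (not raw interaction data) would be exchanged, such a scheme is at least compatible in spirit with the privacy-preserving goal the abstract states, though the paper's concrete protocol may differ.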