Institute of Information Engineering, Chinese Academy of Sciences
Forget scaling depth and width: MOUE unlocks a new "virtual width" dimension for Mixture-of-Experts by reusing a single shared expert pool across layers.
Ditch the training data: S2CDR achieves state-of-the-art cross-domain recommendation by smoothing and sharpening user-item interaction signals with ODEs, entirely without training.
Diffusion models can now generate user preferences for multi-behavior sequential recommendation, outperforming traditional methods by better capturing uncertainty and enabling more diverse recommendations.