Shanghai AI Laboratory, The University of Hong Kong
Forget short-term context windows: VPWEM's Transformer-based memory compressor lets robots ace long-horizon manipulation tasks by distilling past observations into fixed-size episodic memories.
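A minimal sketch of the idea behind a fixed-size episodic memory: a small set of learned memory tokens cross-attends over an arbitrarily long observation history, so the output size stays constant no matter how many past steps the robot has seen. All names and dimensions here (`EpisodicMemoryCompressor`, `num_memory_tokens`, `obs_dim`) are illustrative assumptions, not VPWEM's published architecture.

```python
# Sketch: compress a variable-length observation history into a fixed-size
# episodic memory using learned query tokens and cross-attention.
# Hypothetical names/sizes -- not VPWEM's actual implementation.
import torch
import torch.nn as nn


class EpisodicMemoryCompressor(nn.Module):
    def __init__(self, obs_dim: int = 256, num_memory_tokens: int = 16,
                 num_heads: int = 8, num_layers: int = 2):
        super().__init__()
        # Learned queries that will hold the compressed episode.
        self.memory_tokens = nn.Parameter(torch.randn(num_memory_tokens, obs_dim))
        self.attn_layers = nn.ModuleList([
            nn.MultiheadAttention(obs_dim, num_heads, batch_first=True)
            for _ in range(num_layers)
        ])
        self.norms = nn.ModuleList([nn.LayerNorm(obs_dim) for _ in range(num_layers)])

    def forward(self, obs_history: torch.Tensor) -> torch.Tensor:
        # obs_history: (batch, T, obs_dim); T may be thousands of past steps.
        batch = obs_history.shape[0]
        memory = self.memory_tokens.unsqueeze(0).expand(batch, -1, -1)
        for attn, norm in zip(self.attn_layers, self.norms):
            # Fixed-size queries attend over the full history and are refined.
            attended, _ = attn(memory, obs_history, obs_history)
            memory = norm(memory + attended)
        # Output shape is independent of the history length T.
        return memory  # (batch, num_memory_tokens, obs_dim)


if __name__ == "__main__":
    compressor = EpisodicMemoryCompressor()
    history = torch.randn(2, 1000, 256)   # 1000 past observations per episode
    episodic_memory = compressor(history)
    print(episodic_memory.shape)           # torch.Size([2, 16, 256])
```

The design choice worth noting: because the policy only ever consumes the fixed number of memory tokens, compute per control step stays flat even as the episode grows, which is what makes long-horizon tasks tractable without an ever-expanding context window.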
Bimanual robots can now achieve robust dexterous grasping in the real world, thanks to a massive 20M-frame synthetic dataset and a simple attention-based policy that transfers surprisingly well from simulation to hardware.
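For intuition, here is a hedged sketch of what a "simple attention-based policy" for bimanual control could look like: per-frame observation tokens are fused by a Transformer encoder, and two learned readout tokens (one per arm) are decoded into joint targets. The token layout, dimensions, and action head are assumptions for illustration, not the paper's architecture.

```python
# Sketch: attention-based bimanual policy with one readout token per arm.
# All module names and sizes are hypothetical, chosen for illustration.
import torch
import torch.nn as nn


class BimanualAttentionPolicy(nn.Module):
    def __init__(self, token_dim: int = 256, action_dim_per_arm: int = 7,
                 num_heads: int = 8, depth: int = 4):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=token_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        # One readout token per arm; its final embedding is decoded to actions.
        self.arm_tokens = nn.Parameter(torch.randn(2, token_dim))
        self.action_head = nn.Linear(token_dim, action_dim_per_arm)

    def forward(self, obs_tokens: torch.Tensor) -> torch.Tensor:
        # obs_tokens: (batch, num_tokens, token_dim) from the current frame
        # (e.g. visual patch features plus proprioception embeddings).
        batch = obs_tokens.shape[0]
        arm = self.arm_tokens.unsqueeze(0).expand(batch, -1, -1)
        fused = self.encoder(torch.cat([arm, obs_tokens], dim=1))
        # The first two output tokens are the left- and right-arm readouts.
        return self.action_head(fused[:, :2])  # (batch, 2, action_dim_per_arm)


if __name__ == "__main__":
    policy = BimanualAttentionPolicy()
    frame_tokens = torch.randn(4, 64, 256)
    actions = policy(frame_tokens)
    print(actions.shape)                        # torch.Size([4, 2, 7])
```

Trained by behavior cloning on a large synthetic grasping dataset, a policy of roughly this shape is the kind of model the summary describes; the specifics of the real system may differ.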