The mix of batch and inference workloads in AI data centers creates surprising power dynamics, where smoothing aggregate power demand paradoxically *increases* short-horizon ramping.
Forget scaling data: scaling the *number of tasks* unlocks surprisingly sample-efficient humanoid control via multi-task model-based RL.