This paper introduces a latent Trend ID framework for few-shot adaptation in non-stationary robotic environments, addressing the challenge of concept shift without modifying model parameters. The method estimates a low-dimensional environmental state (Trend ID) via backpropagation, using temporal regularization and a state transition model to ensure smooth latent space evolution and prevent overfitting. Experiments on a food grasping task demonstrate successful adaptation to unseen environments, with learned Trend IDs exhibiting temporal consistency and distinct latent space regions.
Forget fine-tuning: this method adapts robots to changing environments by learning a low-dimensional "Trend ID" embedding, keeping the core model fixed.
Robotic systems operating in real-world environments often suffer from concept shift, where the input-output relationship changes due to latent environmental factors that are not directly observable. Conventional adaptation methods update model parameters, which may cause catastrophic forgetting and incur high computational cost. This paper proposes a latent Trend ID-based framework for few-shot adaptation in non-stationary environments. Instead of modifying model weights, a low-dimensional environmental state, referred to as the Trend ID, is estimated via backpropagation while the model parameters remain fixed. To prevent overfitting caused by per-sample latent variables, we introduce temporal regularization and a state transition model that enforces smooth evolution of the latent space. Experiments on a quantitative food grasping task demonstrate that the learned Trend IDs are distributed across distinct regions of the latent space with temporally consistent trajectories, and that few-shot adaptation to unseen environments is achieved without modifying model parameters. The proposed framework provides a scalable and interpretable solution for robotics applications operating across diverse and evolving environments.
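The adaptation loop described above can be sketched in miniature. The snippet below is an illustrative toy, not the paper's implementation: it assumes a frozen linear model y = W x + U z with a latent Trend ID z, a linear state transition A, and a regularization weight lam, all of which are hypothetical names chosen for the example. Only z is updated, mirroring the paper's key point that model parameters stay fixed during few-shot adaptation.

```python
import numpy as np

# Toy stand-ins for the paper's components (all assumed, not from the paper):
# a frozen model y = W x + U z, a 2-D Trend ID z, and a linear state
# transition z_t ~= A z_{t-1} enforcing smooth latent evolution.
W = np.array([[0.8, -0.2, 0.1, 0.4],
              [0.3,  0.5, -0.6, 0.2]])   # frozen task weights (never updated)
U = np.array([[1.0, 0.5],
              [-0.3, 1.2]])              # frozen latent-to-output map
A = 0.9 * np.eye(2)                      # assumed state transition model

def estimate_trend_id(X, Y, z_prev, lam=0.01, lr=0.1, steps=200):
    """Few-shot Trend ID estimation: gradient descent on z only,
    with a temporal regularizer pulling z toward A @ z_prev."""
    z = z_prev.copy()
    for _ in range(steps):
        resid = X @ W.T + U @ z - Y                 # prediction error on the few shots
        grad = (2.0 * U.T @ resid.mean(axis=0)      # data-fit gradient w.r.t. z
                + 2.0 * lam * (z - A @ z_prev))     # temporal regularization term
        z -= lr * grad                              # update the Trend ID, not W or U
    return z

# Few-shot data from an "unseen environment" with a hidden Trend ID.
rng = np.random.default_rng(0)
z_true = np.array([1.0, -0.5])
X = rng.normal(size=(8, 4))
Y = X @ W.T + U @ z_true
z_hat = estimate_trend_id(X, Y, z_prev=np.zeros(2))
```

In the actual framework the gradient with respect to z would come from backpropagation through a frozen deep network rather than this closed-form linear case, but the structure is the same: the latent state absorbs the concept shift while the weights remain untouched.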