RADAR is introduced as a fully autonomous, closed-loop data generation engine for robotics, eliminating human intervention by using a four-module pipeline involving a Vision-Language Model for task generation and success evaluation, a Graph Neural Network policy for action translation, and a Finite State Machine for environment reset. The system uses forward-reverse planning with a Last-In, First-Out causal sequence to restore workspaces and recover from failures. Evaluations show RADAR achieves high success rates in simulation and reliably executes diverse skills in the real world without domain-specific fine-tuning.
Forget expensive human-in-the-loop data collection: RADAR achieves up to 90% success rates on complex robotic tasks by autonomously generating its own training data through a closed-loop system.
The acquisition of large-scale physical interaction data, a critical prerequisite for modern robot learning, is severely bottlenecked by the cost and scalability limits of human-in-the-loop collection paradigms. To address this, we introduce Robust Autonomous Data Acquisition for Robotics (RADAR), a fully autonomous, closed-loop data generation engine that removes human intervention from the collection cycle. RADAR divides the cognitive load across a four-module pipeline. Anchored by 2-5 3D human demonstrations as geometric priors, a Vision-Language Model (VLM) first orchestrates scene-relevant task generation via semantic object grounding and skill retrieval. Next, a Graph Neural Network policy translates these subtasks into physical actions via in-context imitation learning. Following execution, the VLM performs automated success evaluation using a structured Visual Question Answering pipeline. Finally, to remove the bottleneck of manual resets, a Finite State Machine orchestrates autonomous environment reset and asymmetric data routing. Driven by simultaneous forward-reverse planning with a strict Last-In, First-Out causal sequence, the system restores unstructured workspaces and recovers from execution failures. This continuous brain-cerebellum synergy turns data collection into a self-sustaining process. Extensive evaluations demonstrate RADAR's versatility. In simulation, our framework achieves up to 90% success rates on complex, long-horizon tasks where traditional baselines drop to near-zero performance. In real-world deployments, the system reliably executes diverse, contact-rich skills (e.g., deformable object manipulation) via few-shot adaptation without domain-specific fine-tuning, offering a scalable paradigm for robotic data acquisition.
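The Last-In, First-Out reset idea from the abstract can be illustrated with a short sketch: during forward execution the system records each action together with its inverse, and the reset plan replays those inverses in reverse order so later actions are undone before the earlier actions they depend on. All names below (`ResetStack`, the example action strings) are illustrative assumptions, not RADAR's actual API.

```python
# Hypothetical sketch of a LIFO reset mechanism: forward actions are pushed
# onto a stack with their inverses, and the reset plan pops them in reverse,
# preserving the causal order described in the abstract.

class ResetStack:
    """Records forward actions so the workspace can be restored in reverse."""

    def __init__(self):
        self._stack = []  # list of (action, inverse) pairs, oldest first

    def record(self, action, inverse):
        # Called during forward execution for each completed action.
        self._stack.append((action, inverse))

    def reset_plan(self):
        # LIFO: undo the most recent action first. An object placed inside an
        # opened drawer must be removed before the drawer can be closed.
        return [inverse for _, inverse in reversed(self._stack)]


stack = ResetStack()
stack.record("open_drawer", "close_drawer")
stack.record("place_cube_in_drawer", "remove_cube_from_drawer")
print(stack.reset_plan())
# → ['remove_cube_from_drawer', 'close_drawer']
```

The same stack also supports failure recovery: if execution aborts mid-task, popping the inverses recorded so far returns the workspace to its initial state without a manual reset.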