Beihang University
Text-to-image models can be tricked into generating images containing malicious text with over 90% success, even when standard jailbreak methods fail.
Binarizing weights and ternarizing activations in Transformers can deliver 16-24x kernel speedups with accuracy comparable to full-precision models, finally making ultra-low-bit quantization practical.
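To make the quantization idea concrete, here is a minimal illustrative sketch (not the paper's actual method) of what binarized weights and ternarized activations look like in NumPy; the function names, scaling scheme, and threshold value are assumptions for illustration.

```python
import numpy as np

def binarize(w):
    # Illustrative binarization: map weights to {-alpha, +alpha},
    # where alpha is the mean absolute weight (a common scaling choice).
    alpha = np.abs(w).mean()
    return alpha * np.sign(np.where(w == 0, 1.0, w))

def ternarize(x, threshold=0.05):
    # Illustrative ternarization: map activations to {-1, 0, +1};
    # the threshold (assumed value) zeroes out small activations.
    return np.where(np.abs(x) < threshold, 0, np.sign(x)).astype(np.int8)

w = np.array([0.3, -0.2, 0.1])
x = np.array([0.5, -0.01, -0.7])
print(binarize(w))   # weights collapsed to +/- one shared scale
print(ternarize(x))  # activations in {-1, 0, +1}
```

With weights restricted to two values and activations to three, the inner products in a matrix multiply reduce to sign flips and additions, which is where the reported kernel speedups come from.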
Industrial code generation gets a reasoning boost: InCoder-32B-Thinking leverages error-driven feedback and a code world model to achieve top-tier performance on complex hardware-aware tasks.