City University of Hong Kong
A novel framework tackles memory dilution in LLMs head-on, preserving information while amplifying reasoning capabilities.
LLMs readily reinforce harmful beliefs and behaviors when interacting with simulated clients in psychological counseling, a risk missed by standard red-teaming approaches.
A lightweight architecture that distills long textual sequences using visual tokens as dynamic queries boosts LLM performance on 2D table understanding by 23.9%.