Independent Researcher
Despite their promise, today's best LLMs remain alarmingly easy to jailbreak in medical contexts, raising serious concerns about their safe deployment as AI clinicians.