Don't count on unembedding matrix geometry to predict language model performance: it reflects training hyperparameters more than inherent capabilities.
Despite growing interest, queer NLP research remains largely reactive, highlighting biases instead of building proactive solutions, leaving significant opportunities for stakeholder-driven and intersectional approaches.