This paper introduces the Contrastive learning approach for Attributed Hypergraph Clustering (CAHC), an end-to-end method for attributed hypergraph clustering that addresses the lack of direct clustering supervision in existing contrastive learning methods. CAHC employs a novel contrastive learning approach with node-level and hyperedge-level objectives for representation learning, followed by joint embedding and clustering optimization for cluster assignment. Experiments on eight datasets demonstrate that CAHC outperforms existing baselines.
End-to-end attributed hypergraph clustering via contrastive learning achieves state-of-the-art results by jointly optimizing node embeddings and cluster assignments.
Contrastive learning has demonstrated strong performance in attributed hypergraph clustering. Existing contrastive methods typically first learn node embeddings and then apply a clustering algorithm, such as k-means, to these embeddings to obtain the clustering results. However, these methods lack direct clustering supervision, risking the inclusion of clustering-irrelevant information in the learned embeddings. To this end, we propose a Contrastive learning approach for Attributed Hypergraph Clustering (CAHC), an end-to-end method that simultaneously learns node embeddings and obtains clustering results. CAHC consists of two main steps: representation learning and cluster assignment learning. The former employs a novel contrastive learning approach that incorporates both node-level and hyperedge-level objectives to generate node embeddings. The latter jointly optimizes embeddings and clustering, refining the embeddings with clustering-oriented guidance while producing the clustering results simultaneously. Extensive experimental results demonstrate that CAHC outperforms baselines on eight datasets.
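The abstract does not give concrete formulations, but the two training signals it describes can be sketched under some common assumptions: an InfoNCE-style node-level contrastive loss between two augmented views, hyperedge embeddings pooled from member nodes for the hyperedge-level loss, and DEC-style soft cluster assignments refined jointly with the embeddings. All of these specific choices are illustrative, not CAHC's actual losses.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Node-level contrastive loss: matching rows of z1/z2 are positives.
    (Assumed InfoNCE form; the paper's exact objective is not given.)"""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                       # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))             # positives on the diagonal

def hyperedge_embeddings(z, incidence):
    """Pool node embeddings into hyperedge embeddings (mean over members),
    so the same contrastive loss can be applied at the hyperedge level."""
    deg = incidence.sum(axis=1, keepdims=True)     # nodes per hyperedge
    return (incidence @ z) / np.maximum(deg, 1)

def soft_assignments(z, centroids, alpha=1.0):
    """Student's t soft assignment (DEC-style clustering head), a common way
    to obtain cluster probabilities that can be optimized jointly."""
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
z_view1 = rng.normal(size=(6, 4))                  # two augmented views of 6 nodes
z_view2 = z_view1 + 0.05 * rng.normal(size=(6, 4))
H = rng.integers(0, 2, size=(3, 6)).astype(float)  # incidence: 3 hyperedges x 6 nodes

node_loss = info_nce(z_view1, z_view2)             # node-level objective
edge_loss = info_nce(hyperedge_embeddings(z_view1, H),
                     hyperedge_embeddings(z_view2, H))  # hyperedge-level objective
q = soft_assignments(z_view1, rng.normal(size=(2, 4)))  # soft clusters (k=2)
```

In an end-to-end method, the sum of such contrastive losses and a clustering loss on `q` would be minimized together, so the embeddings are shaped by clustering supervision rather than fixed before a separate k-means step.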