
Multi-Label Text Classification using Attention-Based Graph Neural Network

Improving Natural Language Processing

Multi-Label Text Classification (MLTC), in which one or more labels are assigned to each input sample, is essential for effective Natural Language Processing (NLP). However, most MLTC tasks involve dependencies or correlations among labels that traditional classification methods overlook.
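
As a rough illustration of the multi-label setup (not taken from the paper), each label is typically given its own independent sigmoid score, so a single input can receive any subset of labels. The scores, the 0.5 threshold, and the variable names below are all assumptions made for the example.

```python
# Minimal sketch: multi-label prediction with one sigmoid per label.
# Logits and threshold are illustrative, not from the paper.
import torch

logits = torch.tensor([2.1, -0.4, 0.9, -3.0, 1.5])  # scores for one sample, 5 labels
probs = torch.sigmoid(logits)                        # independent per-label probabilities
predicted = (probs > 0.5).nonzero(as_tuple=True)[0]  # all labels above the threshold
print(predicted.tolist())                            # e.g. [0, 2, 4]
```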

This paper shows how an attention-based graph neural network can capture these label dependencies and use them to improve classification. Evaluation on five real-world MLTC datasets shows that the proposed model consistently achieves higher accuracy than conventional approaches.
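
To make the general idea concrete, the sketch below shows a generic graph-attention layer that updates label embeddings by attending over a label graph. This is only an illustration of the technique, not the authors' implementation: the class name, tensor shapes, and the assumption that the adjacency matrix includes self-loops are all hypothetical.

```python
# Illustrative graph-attention layer over label nodes (PyTorch).
# Not the paper's code; shapes and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelGraphAttention(nn.Module):
    """One GAT-style layer that propagates information between label embeddings."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, label_emb, adj):
        # label_emb: (L, in_dim) label embeddings; adj: (L, L) label graph with self-loops
        h = self.proj(label_emb)                               # (L, out_dim)
        L = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(L, L, -1),
             h.unsqueeze(0).expand(L, L, -1)], dim=-1)         # all label pairs (L, L, 2*out_dim)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))    # raw attention scores (L, L)
        scores = scores.masked_fill(adj == 0, float('-inf'))   # keep only edges in the label graph
        alpha = torch.softmax(scores, dim=-1)                  # attention over neighbouring labels
        return alpha @ h                                       # updated label embeddings (L, out_dim)

# Tiny usage example with made-up sizes.
labels, dim = 4, 8
emb = torch.randn(labels, dim)
adj = torch.eye(labels)                 # self-loops; add 1s for correlated label pairs
adj[0, 1] = adj[1, 0] = 1.0
out = LabelGraphAttention(dim, dim)(emb, adj)   # (4, 8)
```

In a setup like this, the updated label embeddings would typically be matched against a text encoder's output (for example with a dot product) to score each label for a given document; the exact combination used in the paper may differ.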

Ankit Pal, Muru Selvakumar, and Malaikannan Sankarasubbu conducted this research for the Saama AI Research team.

  • Download Paper
