Relation extraction (RE) is vital in natural language processing (NLP) for analyzing the relationships among entities in unstructured text, supporting applications such as knowledge graphs and question-answering systems. Existing methods often rely on graph neural networks (GNNs) and pretrained language models such as BERT, but they struggle with long-range dependencies and with class imbalance among sparse relations. In this paper, we introduce AxU-Doc, an innovative model that leverages axial attention within a U-shaped architecture to effectively acquire comprehensive information and promote l...
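To make the central mechanism concrete, the following is a minimal, illustrative sketch of axial attention over a 2D feature map (e.g., an entity-pair interaction grid), assuming a PyTorch setting; it is not the authors' implementation, and the class and parameter names (AxialAttention, dim, heads) are assumptions introduced here for illustration only.

```python
# Minimal sketch of axial attention: attend along rows, then columns,
# so each position sees its full row and column rather than the whole
# H*W grid at once. Names are illustrative, not from the paper.
import torch
import torch.nn as nn

class AxialAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, H, W, dim)
        b, h, w, d = x.shape

        # Row-wise attention: each row is treated as an independent sequence.
        rows = x.reshape(b * h, w, d)
        rows, _ = self.row_attn(rows, rows, rows)
        x = x + rows.reshape(b, h, w, d)  # residual connection

        # Column-wise attention: transpose so columns become sequences.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, d)
        cols, _ = self.col_attn(cols, cols, cols)
        x = x + cols.reshape(b, w, h, d).permute(0, 2, 1, 3)
        return x

# Example: a batch of 2 feature maps of size 32x32 with 64-dim features.
feat = torch.randn(2, 32, 32, 64)
out = AxialAttention(dim=64, heads=4)(feat)
print(out.shape)  # torch.Size([2, 32, 32, 64])
```

The appeal of this factorization is that attention cost drops from O((HW)^2) to O(HW(H+W)), which is what makes it attractive for capturing long-range dependencies over large grids.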