Chalkidis, I., Dai, X., Fergadiotis, M., Malakasiotis, P., & Elliott, D. (2022). An Exploration of Hierarchical Attention Transformers for Efficient Long Document Classification. https://doi.org/10.48550/arXiv.2210.05529