APA (7th ed.) Citation

Chalkidis, I., Dai, X., Fergadiotis, M., Malakasiotis, P., & Elliott, D. (2022). An exploration of hierarchical attention transformers for efficient long document classification (arXiv:2210.05529). arXiv. https://doi.org/10.48550/arXiv.2210.05529

Chicago Style (17th ed.) Citation

Chalkidis, Ilias, Xiang Dai, Manos Fergadiotis, Prodromos Malakasiotis, and Desmond Elliott. "An Exploration of Hierarchical Attention Transformers for Efficient Long Document Classification." Preprint, arXiv, 2022. https://doi.org/10.48550/arXiv.2210.05529.

MLA (9th ed.) Citation

Chalkidis, Ilias, et al. "An Exploration of Hierarchical Attention Transformers for Efficient Long Document Classification." arXiv, 2022, https://doi.org/10.48550/arXiv.2210.05529.

Warning: These citations may not be 100% accurate.