Noh, K., Kwak, H., Son, J., Kim, S., Um, M., Kang, M., . . . Kim, S. (2024). Retention-aware zero-shifting technique for Tiki-Taka algorithm-based analog deep learning accelerator. Science Advances, 10(24), eadl3350. https://doi.org/10.1126/sciadv.adl3350