On block g-circulant matrices with discrete cosine and sine transforms for transformer-based translation machine

Euis Asriani, Intan Muchtadi-Alamsyah, and Ayu Purwarianti, On block g-circulant matrices with discrete cosine and sine transforms for transformer-based translation machine. MDPI, 12 (11). p. 2014.

Text (Full Text)
mathematics-12-01697-v3.pdf


Abstract

The transformer has emerged as one of the modern neural network architectures applied in numerous applications. However, the transformer’s large and deep architecture makes it computationally and memory-intensive. In this paper, we propose block g-circulant matrices to replace the dense weight matrices in the feedforward layers of the transformer, and we leverage the DCT-DST (discrete cosine transform–discrete sine transform) algorithm to multiply these matrices with the input vector. Our tests on Portuguese–English datasets show that the proposed method improves model memory efficiency compared to the dense transformer, at the cost of a slight drop in accuracy. We found that the Dense-block 1-circulant DCT-DST model of dimension 128 achieved the highest model memory efficiency, at 22.14%. We further show that the same model achieved a BLEU score of 26.47%.
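The paper's DCT-DST multiplication scheme is not reproduced here, but the underlying idea — that a structured (g-)circulant weight matrix can be stored as a single vector and applied in O(n log n) via a fast transform — can be sketched with the classical FFT diagonalization of an ordinary 1-circulant matrix. All function names below are illustrative, not from the paper; a g-circulant matrix (row i equal to row 0 cyclically shifted by g·i) reduces to this case when g = 1.

```python
import numpy as np

def circulant(c):
    """Build an n x n 1-circulant matrix whose first column is c.
    Column j is c cyclically rolled down by j positions, so
    C[i, j] = c[(i - j) mod n]."""
    n = len(c)
    return np.column_stack([np.roll(c, j) for j in range(n)])

def circulant_matvec_fft(c, x):
    """Multiply the circulant matrix defined by first column c with x
    in O(n log n): C x is the circular convolution of c and x, which the
    FFT diagonalization C = F^{-1} diag(F c) F turns into a pointwise
    product in the frequency domain."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

rng = np.random.default_rng(0)
c = rng.standard_normal(8)   # one column suffices to store the matrix
x = rng.standard_normal(8)

dense = circulant(c) @ x          # O(n^2) dense multiply
fast = circulant_matvec_fft(c, x) # O(n log n) transform-based multiply
print(np.allclose(dense, fast))   # True
```

The memory saving the abstract reports comes from the same observation: an n × n circulant block is determined by n parameters instead of n², and the paper replaces the FFT above with real-valued DCT/DST factorizations.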

Item Type: Article
Uncontrolled Keywords: transformer; block g-circulant matrices; DCT-DST algorithm; Kronecker product
Subjects: Q Science > QA Mathematics
Divisions: KARYA TULIS DOSEN
Depositing User: UPT Perpustakaan UBB
Date Deposited: 05 Feb 2026 07:02
Last Modified: 05 Feb 2026 07:02
URI: https://repository.ubb.ac.id/id/eprint/13171
