Journal Article

Academic Year 114
Semester 1
Publication Date 2025-10-30
Title Replacing Batch Normalization with Memory-Based Affine Transformation for Test-Time Adaptation
Title (Other Language)
Authors Jih Pin Yeh; Joe-Mei Feng; Hwei Jen Lin; Yoshimasa Tokuyama
Affiliation
Publisher
Citation (journal, volume/issue, pages) Electronics 14(21), p.4251
Abstract Batch normalization (BN) has become a foundational component of modern deep neural networks. However, it relies on batch statistics that may be unreliable or unavailable during inference, particularly under test-time domain shifts. While batch-statistics-free affine transformation methods alleviate this by learning per-sample scale and shift parameters, most treat samples independently, overlooking temporal or sequential correlations in streaming or episodic test-time settings. We propose LSTM-Affine, a memory-based normalization module that replaces BN with a recurrent parameter generator. By leveraging an LSTM, the module produces channel-wise affine parameters conditioned on both the current input and its historical context, enabling gradual adaptation to evolving feature distributions. Unlike conventional batch-statistics-free designs, LSTM-Affine captures dependencies across consecutive samples, improving stability and convergence under gradual distribution shifts. Extensive experiments on few-shot learning and source-free domain adaptation benchmarks demonstrate that LSTM-Affine consistently outperforms BN and prior batch-statistics-free baselines, particularly when adaptation data are scarce or non-stationary.
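The mechanism described in the abstract, an LSTM that carries memory across consecutive test samples and emits channel-wise scale and shift parameters in place of BN's batch-statistics-based transform, can be illustrated with a minimal PyTorch sketch. Everything below (the class name LSTMAffine, the channel-mean input summary, the hidden size, and the near-identity scale initialization) is an assumption for illustration, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class LSTMAffine(nn.Module):
    # Hypothetical sketch: an LSTM cell generates per-channel affine
    # parameters (gamma, beta) conditioned on the current input and its
    # history, replacing BN's batch-statistics-based normalization.
    def __init__(self, num_channels: int, hidden_size: int = 64):
        super().__init__()
        self.hidden_size = hidden_size
        self.lstm = nn.LSTMCell(num_channels, hidden_size)
        self.to_gamma = nn.Linear(hidden_size, num_channels)  # scale head
        self.to_beta = nn.Linear(hidden_size, num_channels)   # shift head
        self.state = None  # (h, c) memory carried across consecutive samples

    def reset_memory(self):
        # Call at episode/stream boundaries to forget stale context.
        self.state = None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W); summarize each sample by its channel-wise mean.
        summary = x.mean(dim=(2, 3))  # (B, C)
        if self.state is None or self.state[0].size(0) != x.size(0):
            h0 = summary.new_zeros(x.size(0), self.hidden_size)
            self.state = (h0, h0.clone())
        h, c = self.lstm(summary, self.state)
        # Detach so gradients do not flow through the whole stream
        # (truncated backpropagation through time).
        self.state = (h.detach(), c.detach())
        gamma = 1.0 + self.to_gamma(h)  # scale, near identity at initialization
        beta = self.to_beta(h)          # shift
        return x * gamma[:, :, None, None] + beta[:, :, None, None]

# Example: adapt over a stream of batches with gradually drifting statistics.
if __name__ == "__main__":
    layer = LSTMAffine(num_channels=32)
    for step in range(5):
        batch = torch.randn(8, 32, 16, 16) + 0.1 * step  # simulated shift
        out = layer(batch)
        print(step, out.shape)
```

In a test-time adaptation loop, one such module would stand in for each BN layer, with reset_memory() invoked at episode or stream boundaries so that stale context does not leak across unrelated sequences.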
Keywords batch normalization;affine transformation;LSTM;test-time adaptation;memory-based learning;domain adaptation;few-shot learning;normalization-free networks;deep neural networks;feature distribution shift
Language en
ISSN
Journal Type International
Indexed in SCI
Industry-Academia Collaboration
Corresponding Author
Peer Review System
Country CHE
Open Call for Papers
Publication Format Electronic
Related Links

Institutional Repository Link ( http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/128257 )