MeTa Learning-Based Optimization of Unsupervised Domain Adaptation Deep Networks
Academic Year: 113
Semester: 1
Date of Publication: 2025-01-10
Title: MeTa Learning-Based Optimization of Unsupervised Domain Adaptation Deep Networks
Title (Other Languages):
Authors: Hsiau-Wen Lin, Trang-Thi Ho, Ching-Ting Tu, Hwei Jen Lin, Chen-Hsiang Yu
Affiliation:
Publisher:
Journal Title, Volume/Issue, Pages: Mathematics, 13(2):226
Abstract: This paper introduces a novel unsupervised domain adaptation (UDA) method, MeTa Discriminative Class-Wise MMD (MCWMMD), which combines meta-learning with a Class-Wise Maximum Mean Discrepancy (MMD) approach to enhance domain adaptation. Traditional MMD methods align overall distributions but struggle with class-wise alignment, which reduces feature distinguishability. MCWMMD incorporates a meta-module that dynamically learns a deep kernel for MMD, improving alignment accuracy and model adaptability. This meta-learning technique enhances the model's ability to generalize across tasks by ensuring domain-invariant and class-discriminative feature representations. Despite the complexity of the method, including the need for meta-module training, it represents a significant advancement in UDA. Future work will explore scalability in diverse real-world scenarios and further optimize the meta-learning framework. MCWMMD offers a promising solution to the persistent challenge of domain adaptation, paving the way for more adaptable and generalizable deep learning models.
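The abstract describes class-wise MMD alignment with a meta-learned deep kernel. The PyTorch sketch below only illustrates that general idea and is not the paper's implementation: the names gaussian_mmd2, DeepKernelClassWiseMMD, and phi are hypothetical, the small MLP phi merely stands in for the meta-module's learned deep kernel, and target labels would normally be pseudo-labels in UDA.

```python
import torch
import torch.nn as nn

def gaussian_mmd2(x, y, sigma=1.0):
    # Biased estimator of squared MMD between sample sets x and y with an RBF kernel.
    xx = torch.cdist(x, x) ** 2
    yy = torch.cdist(y, y) ** 2
    xy = torch.cdist(x, y) ** 2
    k = lambda d2: torch.exp(-d2 / (2.0 * sigma ** 2))
    return k(xx).mean() + k(yy).mean() - 2.0 * k(xy).mean()

class DeepKernelClassWiseMMD(nn.Module):
    # A small learnable feature map stands in for the meta-learned deep kernel:
    # the RBF kernel is evaluated on phi(x) rather than on raw features.
    def __init__(self, in_dim, hid_dim=128):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, hid_dim),
        )

    def forward(self, src_feat, src_lbl, tgt_feat, tgt_lbl, num_classes):
        # Average the MMD over classes seen in both domains
        # (target labels are typically pseudo-labels in UDA).
        zs, zt = self.phi(src_feat), self.phi(tgt_feat)
        per_class = []
        for c in range(num_classes):
            zs_c, zt_c = zs[src_lbl == c], zt[tgt_lbl == c]
            if len(zs_c) > 1 and len(zt_c) > 1:
                per_class.append(gaussian_mmd2(zs_c, zt_c))
        return torch.stack(per_class).mean() if per_class else zs.new_zeros(())
```

In a training loop, such a class-wise term would typically be added to the task loss so that the backbone and the kernel network are updated jointly.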
Keywords: unsupervised domain adaptation; maximum mean discrepancy (MMD); discriminative class-wise MMD (DCWMMD); meta-learning; deep kernel; feature distributions; domain shift; transfer learning
Language: en_US
ISSN: 2227-7390
Journal Type: Domestic
Indexed in: SCI, EI
Industry-Academia Collaboration:
Corresponding Author:
Peer Review System:
Country: CHE
Open Call for Papers:
Publication Format: Electronic