All Posts

Exploring fresh natural language.
Goals of Continual Learning
- Avoid Catastrophic Forgetting: knowledge of previous tasks must be preserved.
- Positive Forward Transfer: knowledge learned on previous tasks should help with learning subsequent tasks.
- Positive Backward Transfer: knowledge learned on subsequent tasks should also improve performance on previous tasks.
- Task-Order Free Learning: all tasks should be performed well regardless of the order in which they are learned.

Forward Transfer
Forward transfer refers to a model's ability to leverage knowledge from previously learned tasks to improve learning efficiency and performance on a new task. In continual learning, forward transf..
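The goals above are commonly quantified with an accuracy matrix R, where R[i][j] is the accuracy on task j after sequentially training through task i (one common formulation, from the GEM paper's ACC/BWT/FWT metrics). A minimal sketch, assuming such a matrix and a random-initialization baseline:

```python
def transfer_metrics(R, baseline):
    """Compute ACC, BWT, and FWT from an accuracy matrix.

    R[i][j]: accuracy on task j after sequentially training on tasks 0..i.
    baseline[j]: accuracy of a randomly initialized model on task j.
    """
    T = len(R)
    # Average accuracy over all tasks after the final training stage.
    acc = sum(R[T - 1]) / T
    # Backward transfer: how performance on earlier tasks changed
    # after training on later tasks (negative => forgetting).
    bwt = sum(R[T - 1][i] - R[i][i] for i in range(T - 1)) / (T - 1)
    # Forward transfer: zero-shot gain on task i from training on tasks < i.
    fwt = sum(R[i - 1][i] - baseline[i] for i in range(1, T)) / (T - 1)
    return acc, bwt, fwt
```

Negative BWT indicates catastrophic forgetting; positive BWT and FWT correspond to the positive backward/forward transfer goals listed above.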
· Paper Review
Advancing continual lifelong learning in neural information retrieval: definition, dataset, framework, and empirical evaluation
Publication Info: Information Sciences 2025
URL: https://www.sciencedirect.com/science/article/pii/S0020025524012829
Contribution
- Clearly defines the Continual Learning paradigm in the context of IR tasks
- Proposes the Topic-MS-MARCO dataset for evaluating Continual IR
  - Includes topic-wise IR tasks and predefined task similarity
- CLNIR..
· Paper Review
Dense Retrieval Adaptation using Target Domain Description
Cited by 3 ('2024-10-22)
Publication Info: ACM ICTIR 2023
URL: https://arxiv.org/abs/2307.02740
In information retrieval (IR), domain adaptation is the process of adapting a retrieval model to a new domain whose data distribution is different from the source domain. Existing methods ..
· Paper Review
GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval
Cited by 142 ('2024-10-22)
Publication Info: NAACL 2022
URL: https://aclanthology.org/2022.naacl-main.168
Kexin Wang, Nandan Thakur, Nils Reimers, Iryna Gurevych. Proceedings of the 2022 Conference of the North American Chapter of..
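GPL trains the dense retriever with a MarginMSE objective, distilling a cross-encoder teacher: the student is pushed to match the teacher's score margin between a generated-query's positive passage and a mined negative. A minimal sketch of that loss, assuming scores are given as plain Python floats (in the paper they are dense dot products and cross-encoder scores, respectively):

```python
def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
    """MarginMSE: mean squared error between student and teacher margins.

    Each argument is a list of scores; index k corresponds to one
    (query, positive passage, negative passage) training triple.
    """
    n = len(student_pos)
    total = 0.0
    for sp, sn, tp, tn in zip(student_pos, student_neg,
                              teacher_pos, teacher_neg):
        # Margin = score(query, positive) - score(query, negative).
        total += ((sp - sn) - (tp - tn)) ** 2
    return total / n
```

Matching margins rather than absolute scores lets the student keep its own score scale while reproducing the teacher's relative preferences.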
· Paper Review
Continual Learning of Long Topic Sequences in Neural Information Retrieval
Cited by 6 ('2024-10-22)
Publication Info: ECIR 2022
URL: https://arxiv.org/abs/2201.03356
In information retrieval (IR) systems, trends and users' interests may change over time, altering either the distribution of requests or contents to be recommend..
· Paper Review
Studying Catastrophic Forgetting in Neural Ranking Models
Cited by 23 ('2024-10-22)
Publication Info: ECIR 2021
URL: https://arxiv.org/abs/2101.06984
Several deep neural ranking models have been proposed in the recent IR literature. While their transferability to one target domain held by a dataset has been widely addressed using traditional domain adaptation strategies, the question of their cross..
oneonlee
One Only