[논문 리딩] Do Pre-trained Models Benefit Knowledge Graph Completion? A Reliable Evaluation and a Reasonable Approach
Keywords: Completion, Graph, KG, KGC, LLM
Year: 2022
Author: Xin Lv
Venue: ACL Findings 2022
Memo: PKGC. Uses a triple prompt + support prompt as the PLM input.
Category: Research
Status: DONE
Created: @November 21, 2023, 3:13 PM
Last edited: @November 27, 2023, 3:10 AM

Working
@inproceedings{Lv2022DoPM, title={Do Pre-trained Models Benefit Knowledge Graph Completion? A Reliable..
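The memo's core idea (verbalize a triple into a prompt, append support sentences, and feed the result to a PLM for plausibility classification) can be sketched as follows. This is a minimal illustration, not PKGC's actual implementation: the function name, the relation template, and the example sentences are all assumptions made for this note.

```python
def build_pkgc_input(head, relation_template, tail, supports, max_supports=2):
    """Build a PLM input: triple prompt followed by support prompts.

    The triple prompt verbalizes (head, relation, tail) with a
    relation-specific template; support prompts append auxiliary
    sentences (e.g., entity descriptions) as extra context.
    All names and templates here are illustrative assumptions.
    """
    triple_prompt = relation_template.format(h=head, t=tail)
    support_prompt = " ".join(supports[:max_supports])
    return f"{triple_prompt} {support_prompt}".strip()


# Hypothetical example triple and support sentences:
text = build_pkgc_input(
    "Lionel Messi",
    "{h} plays for {t}.",
    "Inter Miami",
    [
        "Lionel Messi is an Argentine footballer.",
        "Inter Miami is an American soccer club.",
    ],
)
# The concatenated text would then be scored by a PLM (e.g., BERT)
# with a binary classification head for triple plausibility.
```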