
Generation + Embedding Seminar Preparation

이게될까 2025. 6. 12. 22:58

 

2025.02.25 - [AI/Paper Review or Progress] - GRIT: Generation and Embedding at Once, Generative Representational Instruction Tuning - Paper Review

 

GRIT: Generation and Embedding at Once, Generative Representational Instruction Tuning - Paper Review

https://arxiv.org/abs/2402.09906 Existing models were only good at generation or only good at embedding. So GRIT appeared, trained on both so that it does both well. By handling both generation and representation, it …

yoonschallenge.tistory.com

 

https://arxiv.org/abs/2402.09906

 

Generative Representational Instruction Tuning

All text-based language problems can be reduced to either generation or embedding. Current models only perform well at one or the other. We introduce generative representational instruction tuning (GRIT) whereby a large language model is trained to handle …

arxiv.org
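To keep the core idea in mind for the seminar: GRIT keeps a single decoder-only LLM and switches between the two task types through instructions. The embedding path pools hidden states (trained with a contrastive objective), while the generation path decodes as usual (trained with the language-modeling objective). The snippet below is only a minimal illustration of that dual use with Hugging Face transformers; the model name (gpt2), mean pooling, and the prompt strings are stand-ins of mine, not the paper's actual setup.

```python
# Minimal sketch (not the official GRIT code): one decoder-only LM used for
# both embedding and generation. Model name, pooling, and prompts are stand-ins.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; GRIT fine-tunes a much larger instruction LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def embed(text: str) -> torch.Tensor:
    """Embedding mode: run the LM once and mean-pool the last hidden states."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    last_hidden = out.hidden_states[-1]          # (1, seq_len, dim)
    return last_hidden.mean(dim=1).squeeze(0)    # (dim,)

def generate(prompt: str) -> str:
    """Generation mode: ordinary autoregressive decoding with the same weights."""
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        ids = model.generate(**inputs, max_new_tokens=30, do_sample=False)
    return tokenizer.decode(ids[0], skip_special_tokens=True)

# Same weights, two behaviors, switched only by how the model is called.
query_vector = embed("Represent this query for retrieval: what is GRIT?")
answer_text = generate("Instruction: explain instruction tuning briefly.\nAnswer:")
print(query_vector.shape)   # torch.Size([768]) for gpt2
print(answer_text)
```

In the actual paper the two behaviors are trained jointly, a contrastive loss on the pooled representations and a next-token loss on the generations, which is what lets a single checkpoint serve both retrieval and answering.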

 

 

 

2025.02.26 - [AI/Paper Review or Progress] - Embedding + Generation Model Preliminary Paper Survey 1: Gecko, COCOM

 

Embedding + Generation Model Preliminary Paper Survey 1: Gecko, COCOM

2025.02.25 - [AI/Paper Review or Progress] - GRIT: Generation and Embedding at Once, Generative Representational Instruction Tuning - Paper Review. The starting point will be this paper. If there is a model that does generation and embedding at the same time, …

yoonschallenge.tistory.com

 

 

 

https://arxiv.org/abs/2407.09252

 

Context Embeddings for Efficient Answer Generation in RAG

Retrieval-Augmented Generation (RAG) allows overcoming the limited knowledge of LLMs by extending the input with external information. As a consequence, the contextual inputs to the model become much longer, which slows down decoding time, directly translating …

arxiv.org
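Same exercise for the COCOM idea, for the seminar discussion: instead of pasting the full retrieved documents into the prompt, the contexts are compressed into a small number of context embeddings that the decoder attends to, so decoding gets cheaper. The snippet below is only a toy illustration of the mechanics, not the paper's method: the chunk-mean "compressor" stands in for COCOM's learned compressor, and the model name and compression rate are my assumptions.

```python
# Toy sketch of context compression for RAG (not COCOM's implementation):
# replace a long retrieved passage with a few pooled "context embeddings"
# and feed those to the decoder via inputs_embeds. Chunk-mean pooling is a
# crude stand-in for a learned compressor; gpt2 is just a placeholder model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()
embed_layer = model.get_input_embeddings()

def compress_context(passage: str, num_ctx_embeddings: int = 4) -> torch.Tensor:
    """Split the passage's token embeddings into chunks and mean-pool each one."""
    ids = tokenizer(passage, return_tensors="pt").input_ids[0]
    with torch.no_grad():
        token_embs = embed_layer(ids)                         # (seq_len, dim)
    chunks = torch.chunk(token_embs, num_ctx_embeddings, dim=0)
    return torch.stack([c.mean(dim=0) for c in chunks])       # (k, dim)

retrieved = ("Retrieval-Augmented Generation extends the input of a language "
             "model with external documents, which makes the prompt much longer "
             "and therefore makes decoding slower.")
question = "Question: why does plain RAG slow down decoding? Answer:"

ctx_embs = compress_context(retrieved, num_ctx_embeddings=4)       # (4, dim)
with torch.no_grad():
    q_embs = embed_layer(tokenizer(question, return_tensors="pt").input_ids)

# The decoder now attends over 4 context vectors plus the question tokens
# instead of the full passage plus the question tokens.
inputs_embeds = torch.cat([ctx_embs.unsqueeze(0), q_embs], dim=1)
with torch.no_grad():
    logits = model(inputs_embeds=inputs_embeds).logits

full_len = tokenizer(retrieved + " " + question, return_tensors="pt").input_ids.shape[1]
print("compressed decoder input:", inputs_embeds.shape[1], "positions vs.", full_len)
print("greedy next token:", tokenizer.decode(logits[0, -1].argmax().item()))
# A real system would keep decoding from here with the shortened input.
```

As far as I understand the paper, COCOM trains the compressor (the decoder itself, or a lighter model) and reports sizeable decoding speed-ups; the toy pooling above only shows where the saving comes from, namely far fewer input positions for the decoder to attend over.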

 

 

 
