GRIT: Generation and Embedding at the Same Time (Generative Representational Instruction Tuning) - paper review
https://arxiv.org/abs/2402.09906 Existing models only did generation well or only did embedding well. GRIT was introduced, trained on both tasks so that it performs generation and representation together …
yoonschallenge.tistory.com
https://arxiv.org/abs/2402.09906
Generative Representational Instruction Tuning
All text-based language problems can be reduced to either generation or embedding. Current models only perform well at one or the other. We introduce generative representational instruction tuning (GRIT) whereby a large language model is trained to handle …
arxiv.org
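The card above summarizes GRIT's core idea: a single set of weights serves both tasks, and an instruction decides whether the model embeds or generates. As a rough intuition only (a toy NumPy sketch, not the paper's actual architecture; the embedding table, mean pooling over hidden states, and greedy tied-embedding decoding here are stand-ins I chose for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 100, 16
E = rng.normal(size=(VOCAB, DIM))  # shared toy weights standing in for the LM

def grit_forward(token_ids, mode):
    """One model, two behaviours, switched by the requested mode:
    'embed'    -> pool (toy) hidden states into a single unit vector
    'generate' -> greedily extend the sequence using the same weights
    """
    h = E[np.asarray(token_ids)]          # (seq, dim) toy hidden states
    if mode == "embed":
        v = h.mean(axis=0)                # mean pooling into one representation
        return v / np.linalg.norm(v)      # unit-normalise for cosine similarity
    elif mode == "generate":
        out = list(token_ids)
        for _ in range(3):                # append 3 toy tokens
            logits = E @ E[out[-1]]       # tied-embedding "LM head"
            out.append(int(np.argmax(logits)))
        return out
    raise ValueError(mode)
```

The point of the sketch is only the interface: retrieval code calls the same model with `mode="embed"` and answer generation calls it with `mode="generate"`, which is what lets GRIT-style models serve a whole RAG pipeline alone.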


2025.02.26 - [AI / Paper reviews or projects] - Embedding + Generation Model preliminary paper survey 1: Gecko, COCOM
Embedding + Generation Model preliminary paper survey 1: Gecko, COCOM
2025.02.25 - [AI / Paper reviews or projects] - GRIT: Generation and Embedding at the Same Time (Generative Representational Instruction Tuning) - paper review. Let's start with this paper. If there were a model that does generation and embedding at the same time …
yoonschallenge.tistory.com
https://arxiv.org/abs/2407.09252
Context Embeddings for Efficient Answer Generation in RAG
Retrieval-Augmented Generation (RAG) allows overcoming the limited knowledge of LLMs by extending the input with external information. As a consequence, the contextual inputs to the model become much longer, which slows down decoding time …
arxiv.org
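COCOM's pitch in the card above is that the retrieved context gets compressed into a small number of context embeddings before decoding, so the generator reads far fewer positions. A minimal sketch of that idea (my own stand-in: fixed chunk mean pooling instead of COCOM's learned compressor, with made-up sizes):

```python
import numpy as np

def compress_context(token_vecs, num_ctx_embeddings):
    """Replace a long retrieved context (one vector per token) with a few
    summary embeddings. Real COCOM learns this compression end-to-end;
    here each summary is just the mean of one fixed-size chunk."""
    chunks = np.array_split(token_vecs, num_ctx_embeddings)
    return np.stack([c.mean(axis=0) for c in chunks])

ctx = np.random.default_rng(1).normal(size=(512, 64))  # 512 context token vectors
compressed = compress_context(ctx, 8)                  # 8 vectors instead of 512
```

The decoder then attends over 8 positions instead of 512, which is where the claimed decoding speed-up comes from; the compression ratio (here 64x) is the knob COCOM studies.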

