- "훈민정음에 스며들다" Competition - Dialogue Summarization (Competency Evaluation and Preliminary Round)
- KLUE - Machine Reading Comprehension (ODQA)
- Printing Sample Outputs During Training with the Hugging Face Trainer (QA Task)
- Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks
- QA with Phrase Retrieval
- Closed-book QA with T5
- Reducing Training Bias
- Linking MRC and Retrieval
- Scaling up with FAISS
- Building a WordPiece Vocab from Custom Data and Training a BertTokenizer
- Passage Retrieval - Dense Embedding
- Passage Retrieval - Sparse Embedding
- Generation-based MRC
- Extraction-based MRC
- MRC Intro & Python Basics
- Hugging Face - Analyzing the QA Model Classification Head
- KLUE - Relation Extraction
- GPT - Applications
- Data Augmentation by Transforming Entities
- BERT vs RoBERTa vs ELECTRA
- Hugging Face - Custom Loss
- BERT-based Relation Classification of Sentence Pairs (Korean)
- Introduction to the BERT Language Model and Training It on Korean
- Natural Language Preprocessing
- Artificial Intelligence and Natural Language Processing
- A Method of Relation Extraction Using Pre-training Models
- Hugging Face - KLUE - Relation Extraction
- Hugging Face - Text Classification
- Hugging Face - Chapter 4. Sharing Models and Tokenizers
- Hugging Face - Chapter 3. Fine-tuning
- Hugging Face - Chapter 2. Pretrained Model & Tokenizer
- Hugging Face - Chapter 1. Pipeline, Theory
- NLP: GPT-2/3, ALBERT, ELECTRA
- Byte Pair Encoding
- NLP: GPT-1, BERT
- Transformer - Multi-head Attention
- Transformer - Self-attention
- Beam Search and BLEU Score
- Seq2Seq with Attention
- Recurrent Neural Network - NLP
- NLP Word Embedding
- NLP Basics