Topic: Background I (GPT: Generative Pre-trained Transformer)
Subtopics:
[1] Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving language understanding by generative pre-training. OpenAI.
[2] Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., & Sutskever, I. (2019). Language models are unsupervised multitask learners. OpenAI blog, 1(8), 9.
Korea University Smart Production Systems Lab. (sps.korea.ac.kr)