In 2018, Generative Pre-trained Transformers (GPT, by OpenAI) and Bidirectional Encoder Representations from Transformers (BERT, by Google) were introduced.
Radford, A. et al. (2018). Improving Language Understanding by Generative Pre-Training, https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf
Devlin, J. et al. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, NAACL-HLT 2019, https://aclanthology.org/N19-1423
#HistoryOfAI #ISE2024 #AI #llm @fizise @enorouzi @sourisnumerique
