ADRIEN GUILLE

Associate Professor of Computer Science @ Université Lumière Lyon 2

Representation Learning for NLP - 2024/2025 course

This course provides a general overview of natural language processing. We first review seminal works on vector space representations of words and deep neural networks. We then study the foundations of modern NLP, i.e., the principles of large language models and how to adapt them to diverse tasks.

Pre-LLM Era

Learning vector space representations of words

Leveraging pre-trained word representations for supervised learning

Basic architectures for text classification
Advanced architectures for text generation

LLM Era

Foundation models

Adapting foundation models